VIBRATION-INDUCED ERROR CORRECTION FOR TOUCH SCREEN DISPLAY IN AN AIRCRAFT
Patent abstract:
Systems (700) and methods (500, 600) for correcting a vibration-induced error on a touch screen display device (118) on an aircraft (100) are provided. A method (500) may include (at 502) providing a user interface (120) for display on a touch screen display device (118). The user interface (120) may include one or more virtual elements (202) associated with functions to be performed by a component (730) of the aircraft (100). The method (500) may include (at 504) receiving data indicative of an actual user interaction with the touch screen display device (118). The method (500) may include (at 506) receiving data indicative of a desired user interaction. The method (500) may include (at 508) receiving data indicative of a detected vibration (216). The method (500) may include (at 510) determining a corrected user interaction corresponding to a selected virtual element (202) based on the received data. The method (500) may also include (at 512) sending one or more control signals to one or more components (730) to perform a function associated with the selected virtual element (202).

Publication number: FR3052889A1
Application number: FR1755455
Filing date: 2017-06-16
Publication date: 2017-12-22
Inventors: George R Henderson; Paul Richard Hickman; Luke Patrick Bolton
Applicant: GE Aviation Systems Ltd
IPC main class:
Patent description:
Vibration-induced error correction for a touch screen display in an aircraft

The present invention generally relates to the correction of a vibration-induced error for a touch screen display, and more particularly to the correction of a vibration-induced error for a touch screen display in an aircraft.

During operation of the aircraft, crew members may use touch screen display devices to perform various functions, such as entering flight plan data or checking surrounding weather conditions. These touch screen display devices can enable fast and efficient entry and review of data and flight conditions, while condensing the functions of multiple components into a single screen. The touch screen display device may display one or more user interfaces, which can be navigated by selecting virtual elements displayed on the touch screen display device with, for example, a finger or a stylus.

Nevertheless, during operation, the aircraft may encounter turbulence, gravitational forces, or mechanical vibrations, which may cause crew members to have difficulty controlling their movements. In some cases, the flight conditions can be extremely demanding and cause a crew member wishing to select a particular item on a touch screen display device to inadvertently select an unwanted item. The crew member may then have to re-enter the desired input on the touch screen display device, which may require navigating back through the user interface. Having to re-enter the desired selection on the touch screen display device can create frustration among crew members and decrease efficiency.

Aspects and advantages of the embodiments of the present invention will be set forth in part in the following description, or may be learned from the description, or may be learned by practicing the examples described herein.
An exemplary aspect of the present invention is directed to a computer-implemented method of correcting a vibration-induced input error on a touch screen display on an aircraft. The method may include providing for display, by one or more processors, a user interface on a touch screen display. The user interface may include one or more interactive virtual elements. Each virtual element may be associated with a function to be performed by a component of the aircraft during operation of the aircraft. The method may also include receiving, by the one or more processors, data indicative of an actual user interaction with the touch screen display. The method may also include receiving, by the one or more processors, data indicative of a desired user interaction. The method may also include receiving, by the one or more processors, data indicative of a vibration detected by one or more vibration sensors located on the aircraft. The method may also include determining, by the one or more processors, a corrected user interaction corresponding to at least one selected virtual element based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of the detected vibration. In response to the corrected user interaction, the method may also include sending, by the one or more processors, one or more control signals to one or more components of the aircraft to perform at least a portion of the function associated with the at least one selected virtual element.

Another exemplary aspect of the present invention is a system for correcting a vibration-induced input error on a touch screen display on an aircraft. The system may include a touch screen display device configured to display a user interface. The user interface may include one or more interactive virtual elements. Each virtual element may be associated with a function to be performed by a component of the aircraft during operation of the aircraft.
The system may also include one or more vibration sensors located on the aircraft configured to obtain data indicative of a detected vibration. The system may also include one or more tracking devices configured to obtain data indicative of a desired user interaction. The system may also include a control system configured to receive data indicative of an actual user interaction with at least one virtual element, receive data indicative of a desired user interaction, receive data indicative of a detected vibration, determine a corrected user interaction corresponding to at least one selected virtual element based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of the detected vibration, and, in response to the corrected user interaction, send one or more control signals to one or more components of the aircraft to perform at least a portion of the task associated with the at least one selected virtual element.

Yet another exemplary aspect of the present invention is an aircraft. The aircraft may include an engine. The aircraft may also include a fuselage. The aircraft may also include one or more vibration sensors. The aircraft may also include one or more tracking devices. The aircraft may also include a touch screen display device. The aircraft may also include a control system comprising one or more processors and one or more memory devices located on the aircraft. The one or more memory devices may store instructions that, when executed by the one or more processors, cause the one or more processors to perform operations. The operations may include providing for display a user interface on the touch screen display. The user interface may include one or more interactive virtual elements. Each virtual element may be associated with a function to be performed by a component of the aircraft during operation of the aircraft.
The operations may also include receiving data indicative of an actual user interaction with the touch screen display. The operations may also include receiving data indicative of a desired user interaction from the one or more tracking devices. The operations may also include receiving data indicative of a vibration detected by the one or more vibration sensors. The operations may also include determining a corrected user interaction corresponding to at least one selected virtual element based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of the detected vibration. In response to the corrected user interaction, the operations may also include sending one or more control signals to one or more components of the aircraft to perform at least a portion of the function associated with the at least one selected virtual element.

Other exemplary aspects of the present invention are directed to systems, methods, aircraft, avionics systems, devices, and non-transitory computer-readable media for correcting a vibration-induced error on a touch screen display device on an aircraft. Variations and modifications can be made to these exemplary aspects of the present invention.

These and other features, aspects, and advantages of various embodiments will be better understood with reference to the following description and accompanying claims. The accompanying drawings, which are incorporated herein and form part of this description, illustrate embodiments of the present invention and, together with the description, serve to explain the principles involved.

A detailed presentation of the embodiments intended for those skilled in the art is provided in the description, with reference to the accompanying figures, in which: FIG. 1 is a perspective view of an exemplary portion of an aircraft according to exemplary aspects of the present invention; FIG.
2 is a diagram of an example of interaction with a user interface addressed by the present invention; FIG. 3 is a diagram of an example of interaction with a user interface according to exemplary aspects of the present invention; FIG. 4 is a diagram of an example of interaction with a user interface according to exemplary aspects of the present invention; FIG. 5 shows an exemplary method according to exemplary aspects of the present invention; FIG. 6 shows an exemplary method according to exemplary aspects of the present invention; FIG. 7 shows an exemplary system according to exemplary aspects of the present invention.

Reference will now be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not as a limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit of the invention. For example, features illustrated or described as part of one embodiment may be used with another embodiment to implement yet another embodiment. Thus, it is understood that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.

Exemplary aspects of the present invention are directed to systems and methods for correcting a vibration-induced error with a touch screen display device on an aircraft. Touch screen displays may be used by crew members to enter and review data and flight conditions during the operation of an aircraft. A user interface may be displayed on the touch screen display device, which may allow a crew member to navigate through various menus by selecting virtual items displayed on the screen.
For example, a crew member may enter flight plan information into a flight control system by selecting a virtual item corresponding to a flight plan displayed on the touch screen display device, touching the virtual item with, for example, a finger or a stylus. Once the virtual flight plan element has been selected, a new menu with additional input and/or information options may be displayed to allow fast and efficient entry of flight plan information.

During operation of the aircraft, however, flight conditions may make it difficult for crew members to select a particular virtual item displayed on a touch screen display device. For example, an aircraft may encounter turbulence, gravitational effects, or mechanical vibrations, which may cause a crew member to inadvertently select an item on a touch screen display device that is not the expected item. In such a case, the crew member may have to re-enter the correct selection, which can cause frustration for the crew member and decrease the efficiency provided by using a touch screen display device. In addition, if the in-flight vibration conditions continue for a period of time, the crew member may have difficulty re-entering the correct selection, which may further exacerbate the crew member's frustration and lead to an even greater decrease in efficiency.

Systems and methods according to exemplary aspects of the present invention can correct vibration-induced input errors on a touch screen display device on an aircraft. For example, systems and methods can provide a user interface on a touch screen display device in an aircraft cockpit. The user interface may include one or more virtual elements corresponding to various functions to be performed by one or more components of the aircraft.
For example, a user interface may include virtual elements corresponding to a flight plan or weather conditions, which, when selected, may be used to enter flight plan information or display weather conditions, respectively. A crew member can interact with the virtual element (for example, via a touch interaction with, for example, a finger or a stylus) to perform the associated task.

A processor may be configured to receive data indicative of an actual user interaction with the touch screen display device, such as an input by the crew member. For example, when a crew member selects a virtual item by, for example, touching the virtual item on the touch screen display device with the crew member's finger, the processor may receive data indicative of the selected virtual element.

The processor may be further configured to receive data from one or more vibration sensors located on the aircraft. For example, an accelerometer, a speed sensor, a laser displacement sensor, a proximity sensor, or other vibration sensor may be located in a cockpit of an aircraft. The vibration sensor may be configured to detect a vibration, such as a vibration due to turbulence, gravitational force, or mechanical vibrations, and to send data indicative of the detected vibration (for example, data indicative of the duration and the amplitude of the vibration) to the processor. Using this data, the processor can determine whether the interaction of a crew member with a touch screen display device coincides with a detected vibration, which may indicate that an input error may have taken place.

The processor may further be configured to receive data indicative of a desired user interaction. For example, one or more tracking devices may be configured to track the interaction of the crew member with the touch screen device.
For example, in one embodiment, a camera may be configured to track the movement of a crew member's eye, and further configured to provide the processor with data indicative of the object of the visual focus of the crew member when the crew member interacts with the touch screen display device. The object of the crew member's visual focus may indicate that the crew member intended to select an item corresponding to that focus object. Using this data, the processor can compute an input error, such as a difference between an actual user interaction and the desired user interaction, as determined by the focus object of the crew member.

In one embodiment, a tracking device, such as a camera, may be configured to track the movement of an input device used by a crew member to make a selection on the touch screen display device. For example, an input device may be the finger, hand, or arm of a crew member, or an object held by the crew member, such as a stylus. As used herein, the term "stylus" refers to any object used by a crew member to interact with a touch screen display device, and may include, without limitation, a capacitive stylus, a Wacom digitizer, a Bluetooth compatible stylus, a writing instrument, or any other device used to interact with a touch screen display device.

When the crew member interacts with the touch screen display device, the tracking device may be configured to track the movement of the input device. Data indicative of the movement of the input device can then be provided to the processor, which can model the movement of the input device as it interacts with the touch screen display device. Using this data, the processor can calculate a desired user interaction based on the device path and a detected vibration. For example, a processor may be configured to model a user interaction as if a detected vibration had not occurred.
Using this data, the processor can calculate an input error, such as a difference between an actual user interaction and a desired user interaction, as determined based on the tracked motion of the input device.

Using the data indicative of a detected vibration, the processor may be further configured to determine a probability of disturbance. For example, a processor may be configured to determine a probability that an input error was caused by a detected vibration. For example, if an input error is very large, but little or no vibration has been detected before or during the user interaction, a disturbance probability can be calculated that may indicate a low probability that the input error was caused by the detected vibration. Conversely, if an input error is very large, and a similarly strong vibration has been detected before or during the user interaction, a disturbance probability can be calculated that may indicate a high probability that the input error was caused by the detected vibration.

Depending on the input error and the probability of disturbance, the processor can determine a corrected user interaction. For example, if a disturbance probability is not high enough that a detected vibration is likely to have caused the input error, the processor can determine that the corrected user interaction is the actual user interaction. Nevertheless, if a probability of disturbance is high enough that a detected vibration is likely to have caused an input error, the processor can determine a corrected user interaction based on the actual user interaction and the input error. For example, an actual user interaction may correspond to a first virtual item displayed at a first location on a user interface. Nevertheless, an input error may indicate that a second virtual element located at a second position on the user interface was the desired user interaction.
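The patent does not prescribe a formula for the disturbance probability or for the correction itself. The logic described above can nevertheless be sketched as follows; this is a minimal illustration, assuming planar screen coordinates and an illustrative amplitude-ratio heuristic for the probability (the names `Vec2`, `amplitude_threshold`, and `probability_threshold`, and their default values, are hypothetical):

```python
from dataclasses import dataclass


@dataclass
class Vec2:
    """A touch location on the screen, in assumed pixel coordinates."""
    x: float
    y: float


def corrected_interaction(actual: Vec2, desired: Vec2,
                          vibration_amplitude: float,
                          amplitude_threshold: float = 0.5,
                          probability_threshold: float = 0.7) -> Vec2:
    """Return the corrected user interaction.

    The input error is the offset from the actual interaction to the
    desired interaction; the correction is applied only when the
    disturbance probability suggests the detected vibration likely
    caused the error.
    """
    # Input error: difference between the desired and actual interaction.
    error_x = desired.x - actual.x
    error_y = desired.y - actual.y
    if error_x == 0.0 and error_y == 0.0:
        return actual  # nothing to correct

    # Illustrative disturbance probability: how strongly the detected
    # vibration exceeds a nominal amplitude threshold, clamped to [0, 1].
    # A large input error with little detected vibration thus yields a
    # low probability, as described in the text.
    probability = min(1.0, vibration_amplitude / amplitude_threshold)

    if probability >= probability_threshold:
        # Adjust the actual user interaction by the input error.
        return Vec2(actual.x + error_x, actual.y + error_y)
    return actual  # keep the actual user interaction
```

With a strong detected vibration the touch is moved to the desired location; with a weak one the actual touch stands, mirroring the two cases discussed above.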
The processor may be configured to determine that the probability of disturbance is high enough that the detected vibration likely caused the input error, and to correct the actual user interaction to the second location corresponding to the second virtual element by adjusting the actual user interaction by the input error. The processor can then send one or more control signals to one or more components of the aircraft to perform a function associated with the second virtual element. For example, the processor may send one or more control signals to a remote device, such as a controller located on the aircraft. Alternatively, the one or more control signals may be one or more instructions executed by the processor itself.

In this manner, systems and methods according to exemplary aspects of the present invention can correct a vibration-induced error on a touch screen display device on an aircraft. The exemplary systems and methods of the present invention can have a technical effect of reducing interaction errors, reducing crew member frustration, and increasing efficiency with respect to interactions with the touch screen display device.

Referring now to the figures, exemplary embodiments of the present invention will be presented in more detail. FIG. 1 is a perspective view of an exemplary portion of an aircraft 100 according to exemplary embodiments of the present invention. The aircraft 100 may include, for example, a cockpit 102, an engine 140, and a fuselage 150. A first user (for example, a first crew member, a pilot) may be present in a seat 104 on the left side of the cockpit 102, and another user (for example, a second crew member, a co-pilot) may be present on the right side of the cockpit 102 in a seat 106. The aircraft 100 may include a dashboard 108, which may include one or more multifunctional flight display devices 110, which may be one or more touch screen display devices 118. The aircraft may also include one or more instruments 112.
In some implementations, the one or more instruments 112 may be located on the dashboard 108 in front of one or more users and may provide information to assist in piloting the aircraft 100. The aircraft 100 may include one or more physical control interfaces 116. For example, a physical control interface 116 may be a control interface that is configured to adjust a setting, a parameter, a mechanism, and/or a condition of the aircraft 100. The physical control interfaces 116 may include, for example, a button, a momentary push button, a compressible button, a switch mechanism, a sliding control, a lever, a knob, a gauge, etc.

The aircraft 100 may include one or more aircraft input devices 114 (for example, in the cockpit 102) that may be used by one or more users to provide input to one or more processors and interact with the systems of the aircraft 100. The aircraft input devices 114 may include, for example, any device suitable for accepting input from a user and converting this input to a graphical position on any one of multiple displays. For example, the one or more aircraft input devices 114 may include a joystick, multi-way toggle switches, a mouse, a trackball, a keyboard, a touch screen, a touch keyboard, data entry keys, a microphone suitable for voice recognition, or any other suitable device. In some implementations, each user may have one or more separate aircraft input devices 114. Through the use of the aircraft input devices 114, the one or more users can interact with the elements of the graphical and/or textual data provided for display on the screens of the display devices 110.

One or more user interfaces 120 may be displayed on one or more display devices 110, including one or more touch screen display devices 118. For availability, a user interface 120 may be provided by a display device 110 on each side of the cockpit 102.
In some implementations, one or more of the display devices 110 may be touch screen display devices 118, which may allow a user to view the user interface 120 on the touch screen display device 118 and to interact with the user interface 120 via the touch screen display device 118. In addition and/or alternatively, one or more of the display devices 110 may be operably coupled to the input devices 114 such that a user may interact with the user interface 120 (for example, via a cursor interaction using a trackball, a mouse, etc.) and the textual and/or graphical elements included in the user interface 120.

According to exemplary aspects of the present invention, the user interface 120 may include one or more virtual elements displayed on a touch screen display device 118. A user, such as a crew member, may interact with the virtual element via an actual user interaction. In one example, the virtual element may be a visual representation of a push button, and an associated actual user interaction may be, for example, a push-button interaction on the touch screen display device 118 to perform a function associated with the virtual element.

Each virtual element can be associated with a task to be performed by one or more components of the aircraft. A task may include one or more procedures that are required or recommended for the operation of the aircraft 100. The one or more tasks may include procedures that must be performed before, during, and/or after the operation of the aircraft. For example, a task may include turning on a window heater, arming emergency exit lamps, checking the fire alarm, checking the voice recorder, adjusting pressurization, checking the instruments 112, checking the fuel quantity, releasing a parking brake, lighting a beacon, adjusting the engine anti-ice, adjusting the flaps, locking the cockpit door, turning on the pitot heater, checking a transponder, adjusting the landing lights, etc.
In addition, the one or more tasks may include one or more functions to be performed or displayed by the touch screen display device 118, such as opening a flight plan menu to allow a crew member to enter flight plan information or to display weather information.

The one or more display devices 110, including the one or more touch screen display devices 118, may be configured to communicate by wired and/or wireless connections with a control system 130. For example, in some implementations, a touch screen display device 118 may communicate with the control system 130 via a network. The one or more touch screen display devices 118 may be configured to receive an actual user interaction with the user interface 120 and to provide data indicative of the actual user interaction to the control system 130. For example, a user may provide a touch interaction with one or more virtual elements via a touch screen of a touch screen display device 118. One or more of the touch screen display devices 118 may send data indicative of the actual user interaction with the virtual element to the control system 130. The control system 130 may be configured to receive the data indicative of the actual user interaction with the virtual element.

The cockpit 102 may also include one or more tracking devices 160. The one or more tracking devices 160 may be configured to track a user interaction with a touch screen display device 118. For example, in one embodiment, a tracking device 160 may be a camera configured to track the movement and/or vision of a crew member when the crew member interacts with a touch screen display device 118. As will be presented in more detail with respect to FIGS. 3 and 4, a tracking device 160 may be configured to provide data indicative of a desired user interaction to the control system 130.

The aircraft 100 may also include one or more vibration sensors 170 located on the aircraft, such as, for example, in the cockpit 102.
The one or more vibration sensors 170 may be configured to detect turbulence, gravitational force, or mechanical vibration. The one or more vibration sensors may be one or more accelerometers, speed sensors, laser displacement sensors, proximity probes, or any other vibration sensors configured to sense a vibration. The control system 130 may further be configured to receive data indicative of a vibration detected by the one or more vibration sensors 170 located on the aircraft 100.

In response to receiving data indicative of an actual user interaction with a touch screen display device 118, data indicative of a desired user interaction from the one or more tracking devices 160, and data indicative of a vibration detected by the one or more vibration sensors 170, the control system 130, and more particularly a processor in the control system 130, can be configured to determine a corrected user interaction corresponding to at least one virtual element selected on a user interface 120 displayed on a touch screen display device 118. The corrected user interaction may be an actual user interaction corrected for a vibration-induced error.

In response to determining the corrected user interaction, the control system 130 may be configured to send one or more signals (e.g., control signals) to one or more aircraft components (not shown) to perform the task associated with the selected virtual element. The control system 130 may be in wired or wireless communication with the one or more aircraft components. In addition and/or alternatively, the control system 130 may be configured to communicate with the aircraft components via a network. The one or more aircraft components may be one or more controllers configured to set, adjust, and/or otherwise change a parameter, mechanism, and/or condition of the aircraft.
In addition, the one or more aircraft components may include one or more touch screen display devices 118, and the task may be an interaction with a user interface 120 displayed on the one or more touch screen display devices 118, such as navigating through one or more menus for entering or accessing data or information displayed on the touch screen display device 118. For example, a task may include opening a flight plan menu on a touch screen display device 118 to allow a crew member to enter flight plan information. In response to receiving one or more control signals, the aircraft components may perform a control action to accomplish at least part of the task. For example, in response to receiving one or more control signals to display a flight plan menu, the touch screen display device 118 may then display the current flight plan for the aircraft 100. Similarly, a corrected user interaction with a touch screen display device 118 may be used to control one or more controllers to perform a task associated with the operation of the aircraft, such as, for example, adjusting a cabin pressurization setting or turning on a cabin heating unit.

Referring now to FIG. 2, a diagram of an example of interaction with a user interface is provided. As shown, a user interface 120 is displayed on a touch screen display device 118. The user interface 120 may include one or more virtual elements 202. For example, as shown, four virtual elements 202A-D are shown. Each virtual element 202 can be associated with a function to be performed during the operation of the aircraft 100. For example, the virtual element 202A can be associated with a flight plan, which, when selected, can allow the entry and review of flight plan information. The virtual element 202B may be associated with a terrain map, which, when selected, may allow a graphical representation of the surrounding terrain conditions.
The virtual element 202C may be associated with meteorological conditions, which, when selected, may allow weather conditions to be displayed. The virtual element 202D may be associated with air traffic conditions, which, when selected, may allow the display of surrounding air traffic, such as another aircraft in the vicinity.

Referring again to FIG. 2, an input device 204 may be used to interact with a user interface 120 displayed on a touch screen display device 118. For example, as shown in FIG. 2, a user interacts with a user interface by touching one or more virtual elements with the user's finger. In another embodiment, an input device 204 may be a hand, an arm, or a stylus. As shown, a user may intend to interact with a particular virtual element 202 displayed on a user interface 120. For example, as shown, a line A corresponding to a desired user interaction is shown. As shown, the user intends to interact with the virtual element 202A, which corresponds to a flight plan. Nevertheless, the actual user interaction with a user interface 120 may be impacted by vibration, such as vibration due to turbulence, gravitational forces, or mechanical vibration. For example, the actual user interaction, as represented by the line B, may correspond to an unwanted virtual element, such as the virtual element 202B, which corresponds to a terrain map as shown in FIG. 2. In such a case, the actual user interaction may differ from the desired user interaction due to a vibration-induced error.

Referring now to FIG. 3, a diagram of an exemplary interaction with a user interface according to exemplary aspects of the present invention is provided. As shown, a user interface 120 is displayed on a touch screen display device 118. As shown, a user 206 has a visual focus object 208 in the user's field of view. The visual focus object 208 may correspond to one or more virtual elements, such as the virtual elements 202A-D shown in FIG. 2.
As shown by the line A, the user 206 intends to interact with the visual focus object 208. Nevertheless, as shown by the line B, the actual user interaction with the user interface 120 takes place at an erroneous location 210, where the user's input device 204 (i.e., the user's finger) touches the touch screen display device 118. The touch screen display device 118 may be configured to provide data indicative of the actual user interaction to a processor in a control system, such as a processor in the control system 130 shown in FIG. 1. For example, the data indicative of the actual user interaction may be data corresponding to the erroneous location 210.

According to exemplary aspects of the present invention, a tracking device 160 may be used to obtain data indicative of a desired user interaction with a touch screen display device 118. For example, a tracking device 160 may be a camera configured to track the gaze of a user. In another embodiment, a tracking device 160 may be any device or system configured to track a user's visual focus object 208. The tracking device 160 may be configured to provide data indicative of a user's visual focus object 208 to a processor, such as a processor in the control system 130 shown in FIG. 1. For example, the tracking device 160 may provide a processor with data indicative of a visual focus object 208, and the processor may be configured to determine the user's visual focus object 208 based on the provided data. The processor, such as a processor in the control system 130, may be further configured to determine an input error 212 based on the data received from the touch screen display device 118 and the tracking device 160.
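The gaze-based determination of the input error 212 can be sketched as follows. This is an illustrative sketch only: the coordinate system, the element names, and the bounding-box layout are hypothetical, and the patent does not specify how a gaze point is resolved to a virtual element; a nearest-centre rule is assumed here.

```python
def input_error(touch, gaze):
    """Input error 212: offset from the actual touch location
    (the erroneous location 210) to the user's visual focus point,
    both given as (x, y) tuples in assumed screen coordinates."""
    return (gaze[0] - touch[0], gaze[1] - touch[1])


def focused_element(gaze, elements):
    """Resolve the gaze point to the virtual element whose bounding-box
    centre is nearest; `elements` maps element names to hypothetical
    (x0, y0, x1, y1) bounding boxes."""
    gx, gy = gaze

    def dist_sq(name):
        x0, y0, x1, y1 = elements[name]
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
        return (gx - cx) ** 2 + (gy - cy) ** 2

    return min(elements, key=dist_sq)


# Hypothetical layout: elements 202A and 202B side by side on the screen.
ELEMENTS = {"202A": (0, 0, 100, 100), "202B": (100, 0, 200, 100)}
```

In this sketch, a touch landing inside element 202B while the gaze rests on element 202A yields a nonzero error vector pointing from the erroneous location back toward the focused element.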
For example, a user may intend to interact with a particular virtual element 202 on a touch screen display device 118, and in order to interact with the virtual element 202, the user may focus on that virtual element. Therefore, when the user interacts with the touch screen display device 118, the data indicative of the visual focus object 208 may be data indicative of a desired user interaction. Nevertheless, due to vibration, such as turbulence, the user 206 may interact with the touch screen display device at an erroneous location 210. The processor, such as a processor in the control system 130, may be configured to determine an input error 212 based at least in part on the data indicative of the desired user interaction. The input error 212 may be, for example, a difference between an actual user interaction (i.e., an erroneous location 210) on a touch screen display device 118 and the visual focus object 208 of the user 206 (i.e., a desired user interaction). Referring now to FIG. 4, a diagram of an exemplary interaction with a user interface according to exemplary aspects of the present invention is provided. As shown, a user interface 120 is displayed on a touch screen display device 118. As shown, an input device 204, such as a user's finger, is initially on a path corresponding to a line A during the user's interaction with the touch screen display device 118 before a detected vibration 216 takes place. According to exemplary aspects of the present invention, a tracking device 160 may be used to obtain data indicative of a desired user interaction with a touch screen display device 118. For example, a tracking device 160 may be a camera configured to track the movement of an input device 204, such as a user's finger, during the interaction of a user with a touch screen display device 118. 
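For illustration only, the gaze-based input error described above can be sketched as a simple two-dimensional offset between the touched point and the tracked visual-focus point. The coordinate convention and function name below are hypothetical and are not part of the claimed system:

```python
def input_error(actual_touch, focus_point):
    """Sketch of input error 212: the offset between the erroneous
    location 210 (actual touch) and the visual focus object 208
    (desired interaction), in screen coordinates."""
    return (focus_point[0] - actual_touch[0],
            focus_point[1] - actual_touch[1])
```

Adding this error back to the actual touch location recovers the location the user was looking at, which is the basis for the correction described later in the method.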
In another embodiment, a tracking device 160 may be any other device or system configured to track the movement of an input device 204, such as a wireless system configured to track the movement of a stylus when the stylus interacts with a device. The tracking device 160 may be configured to obtain data indicative of a desired user interaction, such as the movement of an input device 204 when the input device 204 interacts with the touch screen display device 118. The tracking device 160 may be further configured to provide data indicative of a desired user interaction, such as the movement of an input device 204, to a processor, such as a processor in the control system 130 shown in FIG. 1. As shown in FIG. 4, as the input device 204 moves toward the touch screen display device 118, a detected vibration 216 takes place. When the detected vibration 216 takes place, the actual motion of the user corresponds to the line B, rather than the line A. The detected vibration 216 may be a vibration such as turbulence, gravitational force, or mechanical vibration. The detected vibration 216 may be detected by one or more vibration sensors 170 located on the aircraft. The vibration sensors 170 can obtain data indicative of the detected vibration, such as data concerning the time, duration, and amplitude of the detected vibration 216, and provide the data indicative of the detected vibration 216 to a processor, such as a processor in the control system 130. As shown in FIG. 4, as a result of the detected vibration 216, an actual user interaction, or input, received from the input device 204 takes place at the erroneous location 210 on the touch screen display device 118. The touch screen display device 118 may be configured to provide data indicative of the actual user interaction, or input, such as data corresponding to the erroneous location 210 received from the input device 204, to a processor. 
The processor, such as a processor in the control system 130, may be configured to model an effect of the detected vibration 216 on the input device 204. For example, using data indicative of the actual user interaction, such as data corresponding to an erroneous location 210, and data indicative of a desired user interaction, such as data indicative of the motion of the input device 204 as it interacts with a touch screen display device 118 obtained by a tracking device 160, the processor can determine a modeled input 214 that would have occurred had the detected vibration 216 not occurred. For example, the processor may determine that, had the detected vibration not occurred, the input device 204 would have moved along the line A to interact with the touch screen display device 118 at a location corresponding to the modeled input 214. As shown, the line A corresponding to a desired user input is a substantially straight line as the input device 204 approaches the touch screen display device 118. Nevertheless, a model of a user interaction need not be a straight-line model, but rather can be any model configured to determine a modeled input 214 that would have occurred had a detected vibration 216 not occurred. Based on the location of the modeled input 214 and the erroneous location 210, the processor may be further configured to determine an input error 212. For example, the input error may be a difference between the user interaction, or actual input, received at an erroneous location 210 and the location of the modeled input 214 that would have occurred had the detected vibration 216 not occurred. In one embodiment, the one or more tracking devices 160 may be further configured to determine an object of a user's visual focus, such as the visual focus object 208 of a user 206 as shown in FIG. 3, in addition to tracking the movement of an input device 204. The data indicative of a desired user interaction provided to the processor may then be data indicative of the visual focus object 208 of the user 206. 
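As one hedged illustration of the straight-line model described above, the pre-vibration trajectory samples from a tracking device could be fit with a least-squares line in each screen axis and extrapolated to the touch time to estimate the modeled input 214. The sampling format and function name are assumptions made for this sketch, not details of the disclosed system:

```python
def modeled_input(samples, touch_time):
    """Estimate modeled input 214: fit a straight line to finger
    positions tracked before the detected vibration 216 began, then
    extrapolate to the moment the screen was actually touched.
    samples: list of (t, x, y) tuples recorded pre-vibration."""
    n = len(samples)
    mean_t = sum(t for t, _, _ in samples) / n
    mean_x = sum(x for _, x, _ in samples) / n
    mean_y = sum(y for _, _, y in samples) / n
    var_t = sum((t - mean_t) ** 2 for t, _, _ in samples)
    # Least-squares slopes: estimated finger velocity in each axis
    vx = sum((t - mean_t) * (x - mean_x) for t, x, _ in samples) / var_t
    vy = sum((t - mean_t) * (y - mean_y) for t, _, y in samples) / var_t
    # Extrapolate the undisturbed line A to the touch time
    return (mean_x + vx * (touch_time - mean_t),
            mean_y + vy * (touch_time - mean_t))
```

Any other motion model (e.g., a higher-order polynomial fit) could be substituted, consistent with the statement that the model need not be a straight line.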
The processor may be configured to determine the input error based on determining a difference between the data indicative of the user interaction, or actual input, such as an erroneous location 210, and data indicative of the user's visual focus object, such as the visual focus object 208 as shown in FIG. 3. Referring now to FIG. 5, a schematic diagram of an exemplary method (500) according to exemplary embodiments of the present invention is shown. The method (500) may be implemented by one or more processors, such as a processor in the control system 130 shown in FIG. 1. In addition, FIG. 5 represents steps performed in a particular order for purposes of illustration and discussion. Those skilled in the art, using the descriptions provided herein, will understand that the various steps of all the methods described herein can be modified, adapted, extended, rearranged and/or omitted in various ways without departing from the scope of the present invention. At (502), the method 500 may include providing for display a user interface on a touch screen display device, the user interface including one or more interactive virtual elements, each virtual element being associated with a function to be performed by a component of the aircraft during operation of the aircraft. For example, a user interface 120 may be provided for display on a touch screen display device 118. The user interface 120 may include one or more virtual elements, such as the virtual elements 202A-D shown in FIG. 2. Each virtual element may be associated with a function to be performed by a component of the aircraft during operation of the aircraft, such as opening a flight plan menu to allow input of flight plan information, or viewing current surrounding weather conditions. At (504), the method 500 may include receiving data indicative of actual user interaction with the touch screen display device. 
For example, a processor in the control system 130 may receive data indicative of actual user interaction with a touch screen display device 118, such as an interaction at an erroneous location 210. The actual user interaction may be a touch interaction on the touch screen display device 118, which may be received from an input device, such as a finger, a hand, an arm, or a stylus. The touch screen display device 118 may send data indicative of the actual user interaction to the processor of the control system 130, and the processor of the control system 130 may receive the data from the touch screen display device 118. At (506), the method may include receiving data indicative of a desired user interaction. For example, a tracking device 160 may be configured to track a user when the user interacts with the touch screen display device 118. For example, the tracking device 160 may track a user's visual focus object, such as the visual focus object 208 shown in FIG. 3. In addition, the tracking device 160 may be configured to track the movement of an input device, such as an input device 204 as shown in FIG. 4. The tracking device 160 may be, for example, a camera. The data indicative of the desired user interaction may be data indicative of the visual focus object 208 of the user 206, and/or data indicative of the movement of the input device 204. The processor of the control system 130 can receive the data indicative of the desired user interaction. At (508), the method may include receiving data indicative of a detected vibration. For example, a vibration sensor 170 may be located on the aircraft, and may be configured to detect vibration on the aircraft, such as turbulence, gravitational force, or mechanical vibration. For example, a detected vibration may be a detected vibration 216 as shown in FIG. 4. 
The vibration sensor 170 may be configured to provide the data indicative of the detected vibration to the processor in the control system 130. The processor of the control system 130 may receive the data indicative of the detected vibration. At (510), the method may include determining a corrected user interaction corresponding to at least one selected virtual element based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of the detected vibration. For example, the processor may determine that a detected vibration has caused the actual user interaction to differ from a desired user interaction. The processor may correct the user interaction into a corrected user interaction corresponding to the desired user interaction. The corrected user interaction may correspond to at least one virtual element, such as a virtual element 202A-D, displayed on a user interface 120 on a touch screen display device 118. Referring now to FIG. 6, a method (600) according to exemplary embodiments of the present invention is shown. The method (600) can be used, for example, to determine a corrected user interaction, such as the corrected user interaction determined at (510) in the method (500). At (602), the method may include determining an input error. For example, an input error may be based at least in part on data indicative of a desired user interaction received by a processor in the control system 130. The data indicative of a desired user interaction may be data received from one or more tracking devices, such as one or more tracking devices 160 shown in FIGS. 2 and 3. The tracking devices may be, for example, one or more cameras. In addition, the one or more tracking devices, such as one or more tracking devices 160, may be configured to track an object of a visual focus of the user, as shown in FIG. 3. 
The processor may be configured to determine the input error based at least in part on the data provided by the one or more tracking devices. For example, a tracking device 160 may track the visual focus object 208 of a user 206. The tracking device 160 may provide data indicative of the visual focus object 208 of the user 206 to the processor. The processor may also receive data indicative of the actual user interaction, such as data indicative of an erroneous location received from a touch screen display device 118. The processor may be configured to determine an input error, such as an input error 212 corresponding to a difference between the visual focus object 208 and the erroneous location 210. Additionally and/or alternatively, the one or more tracking devices may be configured to track the movement of an input device, such as an input device 204 shown in FIG. 4. The processor may be configured to receive data indicative of the movement of the input device, and to model an input that would have occurred had a detected vibration not occurred. For example, a processor may determine a modeled input 214 that would have occurred had a detected vibration 216 not occurred, based on data provided by a tracking device 160. The processor may be further configured to determine an input error based on the modeled input and the data indicative of the actual user interaction, such as an erroneous location 210. For example, an input error may be a difference between the modeled input 214 and the erroneous location 210. At (604), the method may include determining a probability of disturbance. The probability of disturbance may be a probability that an input error was caused by a detected vibration. For example, a processor may receive data indicative of a detected vibration, such as data received from one or more vibration sensors 170. The data indicative of a detected vibration may be, for example, data indicative of the amplitude of a vibration, the time at which a vibration begins, and the duration of the vibration. 
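One hedged way to turn the vibration amplitude and timing data above into a disturbance probability is to require temporal overlap between the detected vibration and the touch, and then to scale with amplitude. The threshold, the saturation behavior, and the function signature are illustrative choices for this sketch, not requirements of the method:

```python
def disturbance_probability(amplitude, vib_start, vib_end, touch_time,
                            saturating_amplitude=1.0):
    """Sketch of step (604): probability that an input error was
    caused by the detected vibration 216.
    amplitude: sensed vibration amplitude (arbitrary units)
    vib_start, vib_end: interval over which vibration was sensed
    touch_time: moment the touch was registered"""
    # No vibration at the moment of the touch -> no causal link
    if not (vib_start <= touch_time <= vib_end):
        return 0.0
    # Stronger vibration -> higher probability, saturating at 1.0
    return min(1.0, amplitude / saturating_amplitude)
```

This matches the qualitative behavior described next: a high-amplitude vibration coinciding with the touch yields a high probability, while little or no vibration during the touch yields a low one.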
Based on the data indicative of the detected vibration, the processor can determine a probability that an input error was caused by the detected vibration. For example, if a high-amplitude detected vibration occurs at a time coinciding with a large input error, the processor can determine that there is a high probability that the input error was caused by the detected vibration. Conversely, if little or no vibration occurred during a user interaction for which a large input error is determined, the processor may determine that there is a low probability that the input error was caused by the detected vibration. At (606), the method may include determining whether the disturbance probability is sufficiently high that a detected vibration caused an input error. For example, if the probability of disturbance is low and the input error is large, the processor may determine that the input error was likely not caused by a detected vibration. In such a case, the processor may disregard the input error, and determine that the corrected user interaction is the actual user interaction received by a touch screen display device 118. Therefore, the processor can use the original input, or actual user interaction, as shown at (608). In this way, a corrected user interaction may be at least partly a function of the probability of disturbance. If, nevertheless, the probability of disturbance is sufficiently high for a detected vibration to have caused an input error, then at (610) the processor can determine whether the function corresponding to the actual user interaction is a safety-critical function. For example, an actual user interaction may correspond to the input of a flight plan parameter that would avoid a collision with another aircraft. A desired user interaction, in contrast, may correspond to the input of a flight plan parameter that would not avoid the collision with another aircraft. 
In such a case, the processor may determine that the actual user interaction corresponds to a safety-critical function, may disregard the input error, and may determine that the corrected user interaction is the actual user interaction received by a touch screen display device 118. Therefore, the processor may use the original input, or actual user interaction, as shown at (612). In this manner, a corrected user interaction may be the data indicative of the actual user interaction when the data indicative of the actual user interaction includes the selection of a virtual element that includes a safety-critical function. If, nevertheless, the function corresponding to the actual user interaction is not a safety-critical function, then at (614) the processor can adjust the actual user interaction, or input, by the input error. For example, as shown in FIG. 3, an input error 212 may correspond to a difference between an erroneous location 210 and the visual focus object 208 of a user 206. The processor may determine that the user 206 intended to interact with the touch screen display device 118 at a location corresponding to the visual focus object 208. Therefore, the processor can adjust the data indicative of the actual user interaction corresponding to the erroneous location 210 by the input error 212. Thus, the corrected user interaction may be a location on the touch screen display device 118 corresponding to the visual focus object 208. In addition, the corrected user interaction may be a virtual element, such as a virtual element 202A-D, corresponding to the location of the visual focus object 208. Similarly, the corrected user interaction may correspond to a virtual element corresponding to the modeled input 214, as shown in FIG. 4. 
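The decision flow of steps (606) through (614) can be summarized in a short sketch. The probability threshold, the tuple-based point arithmetic, and the function name are hypothetical illustrations of the logic, not the claimed implementation:

```python
def corrected_interaction(actual, error, probability, safety_critical,
                          threshold=0.5):
    """Sketch of steps (606)-(614): decide whether to keep the
    original input or adjust it by the input error 212.
    actual: (x, y) of the actual touch (erroneous location 210)
    error: (dx, dy) input error 212
    probability: disturbance probability from step (604)
    safety_critical: True if the touched element triggers a
    safety-critical function"""
    # (606)/(608): vibration unlikely to have caused the error -> keep input
    if probability < threshold:
        return actual
    # (610)/(612): never auto-correct away a safety-critical selection
    if safety_critical:
        return actual
    # (614): shift the actual touch by the input error
    return (actual[0] + error[0], actual[1] + error[1])
```

The corrected location returned by the final branch would then be hit-tested against the virtual element bounds to identify the selected virtual element.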
In this manner, the corrected user interaction may be a virtual element selected on the touch screen interface corresponding to the location of the data indicative of the actual user interaction adjusted by the input error. Referring again to FIG. 5, in response to determining a corrected user interaction at (510), at (512) the method (500) may include sending one or more control signals to one or more aircraft components to perform at least part of the function associated with the at least one selected virtual element. For example, the processor in the control system 130 may, in response to the corrected user interaction, send one or more control signals to one or more aircraft components to perform at least part of a function associated with the selected virtual element. For example, a selected virtual element may be a virtual element associated with a flight plan, such as the virtual element 202A shown in FIG. 2. The processor may send one or more control signals to a touch screen display device 118 to open a menu corresponding to a flight plan to allow a crew member to enter or review flight plan information. Thus, as shown in FIG. 2, even though the actual user interaction corresponded to the virtual element 202B, the processor can correct the user interaction and perform the function corresponding to the selected virtual element 202A. In this manner, the method (500) according to exemplary aspects of the present invention can correct a vibration-induced error with a touch screen display device. FIG. 7 shows an exemplary system 700 according to exemplary embodiments of the present invention. As shown, the system 700 may include a control system 710 and one or more display devices 720. The control system 710 may correspond to the control system 130 as described in exemplary embodiments of the present invention. The one or more display devices 720 may correspond to one or more touch screen display devices 118 as described in exemplary embodiments of the present invention. 
The control system 710 may include one or more processors 712 and one or more memory devices 714. The control system 710 may also include a network interface used to communicate with the display device 720 and/or one or more aircraft components 730 over a network 740. The one or more components 730 may include one or more aircraft control devices, one or more display devices 110, and one or more touch screen display devices 118. The network interface may include any components suitable for interfacing with one or more networks, including, for example, transmitters, receivers, ports, controllers, antennas, or other suitable components. The network 740 may include a data bus or a combination of wired and wireless communication links. The network 740 can be any type of communications network, such as a local area network (for example, an intranet), a wide area network (for example, the Internet), a cellular network, or a combination of these. The one or more processors 712 may include any suitable processing device, such as a microprocessor, a microcontroller, an integrated circuit, a logic device, or another suitable processing device. The one or more memory devices 714 may include one or more computer readable media, including, but not limited to, non-transitory computer readable media, RAM, ROM, hard disks, flash memories, or other memory devices. The one or more memory devices 714 may store information accessible by the one or more processors 712, including computer readable instructions 716 that may be executed by the one or more processors 712. The instructions 716 may be any set of instructions that, when executed by the one or more processors 712, cause the one or more processors 712 to perform operations. As shown in FIG. 7, the one or more memory devices 714 can also store data 718 that can be retrieved, manipulated, created, or stored by the one or more processors 712. 
The data 718 can include, for example, data associated with virtual elements, tasks, control actions, and/or control devices associated with the aircraft. The data 718 can be stored in one or more databases. The one or more databases may be connected to the control system 710 by a high-bandwidth LAN or WAN, or may also be connected to the control system 710 via the network 740. The one or more databases may be split such that they are located in multiple locations. The display device 720 may include one or more processors 722 and one or more memory devices 724. The display device 720 may also include a network interface used to communicate with the control system 710 and/or one or more aircraft components 730 over the network 740. The network interface may include any components suitable for interfacing with one or more networks, including, for example, transmitters, receivers, ports, controllers, antennas, or other suitable components. The one or more processors 722 may include any suitable processing device, such as a microprocessor, a microcontroller, an integrated circuit, a logic device, or another suitable processing device. The one or more memory devices 724 may include one or more computer readable media, including, but not limited to, non-transitory computer readable media, RAM, ROM, hard disks, flash memories, or other memory devices. The one or more memory devices 724 may store information accessible by the one or more processors 722, including computer readable instructions 726 that may be executed by the one or more processors 722. The instructions 726 may be any set of instructions that, when executed by the one or more processors 722, cause the one or more processors 722 to perform operations. The instructions 726 may be executed by the one or more processors 722 to display, for example, a user interface on an I/O device 727. 
The instructions 726 and/or other programs executed by the one or more processors 722 may allow a user to perform functions on a screen surface, such as providing user interactions (e.g., a press, a cursor) with virtual elements and entering commands and other data through the screen surface. The one or more memory devices 724 may also store data 728 that can be retrieved, manipulated, created, or stored by the one or more processors 722. The data 728 may include, for example, data associated with a user interface, a checklist of operations, checklist elements, tasks, virtual elements, and/or other information associated therewith. The data 728 can be stored in one or more databases. The one or more databases may be connected to the display device 720 by a high-bandwidth LAN or WAN, or may also be connected to the display device 720 via the network 740. The one or more databases may be split such that they are located in multiple locations. The display device 720 may include an input/output device 727. The input/output device 727 may include a touch screen surface. Such a touch screen surface can take any suitable form, including that of a liquid crystal display (LCD), and can use various physical or electrical attributes to detect a user's inputs and interactions. The input/output device 727 may also include a trackball, a mouse, another cursor device, a touch keypad, data input keys, a microphone suitable for voice recognition, and/or other suitable input devices. The input/output device 727 may also include other suitable output devices, such as audio or visual outputs suitable for indicating the elements of a user interface (for example, tasks associated with checklist elements). The system 700 may include one or more vibration sensors 170. The one or more vibration sensors 170 may be configured to provide data indicative of a detected vibration to the control system 710, as described herein. 
The system 700 may also include one or more tracking devices 160. The one or more tracking devices may be configured to provide data indicative of a desired user interaction as described herein. The technology presented here relates to computer-based systems, as well as actions taken by, and information sent to and from, computer-based systems. Those skilled in the art will recognize that the inherent flexibility of computer-based systems allows for a wide variety of configurations, combinations, and divisions of tasks and functionality between and among the possible components. For example, the processes presented here can be implemented using a single computing device or multiple computing devices working in combination. Databases, memories, instructions, and applications can be implemented on a single system or distributed among multiple systems. Distributed components can operate sequentially or in parallel.
Claims (20) [1] A computer-implemented method for correcting a vibration-induced input error on a touch screen display device on an aircraft, comprising: providing for display, by one or more processors, a user interface on a touch screen display device, the user interface comprising one or more interactive virtual elements, each virtual element associated with a function to be performed by a component of the aircraft during operation of the aircraft; receiving, by the one or more processors, data indicative of actual user interaction with the touch screen display device; receiving, by the one or more processors, data indicative of a desired user interaction; receiving, by the one or more processors, data indicative of a detected vibration from one or more vibration sensors located on the aircraft; determining, by the one or more processors, a corrected user interaction corresponding to at least one virtual element selected based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of the detected vibration; and in response to the corrected user interaction, sending, by the one or more processors, one or more control signals to one or more components of the aircraft to perform at least a portion of the function associated with the at least one selected virtual element. [2] The method of claim 1, wherein determining, by the one or more processors, a corrected user interaction comprises determining an input error based at least in part on the data indicative of the desired user interaction. [3] The method of claim 2, wherein the data indicative of the desired user interaction comprises data from one or more tracking devices. [4] The method of claim 3, wherein the one or more tracking devices comprise one or more cameras. 
[5] The method of claim 3 or 4, wherein the one or more tracking devices are configured to track an object of a visual focus of the user, wherein the data indicative of the desired user interaction comprises data indicative of the object of the visual focus of the user. [6] The method of claim 5, wherein determining the input error comprises determining a difference between the data indicative of the actual user interaction and the data indicative of the object of the user's visual focus. [7] The method of any one of claims 3 to 6, wherein the data indicative of the actual user interaction comprises an input on the touch screen display device received by an input device, wherein the one or more tracking devices are configured to track a movement of the input device, and wherein the data indicative of the desired user interaction comprises data indicative of the movement of the input device. [8] The method of claim 7, wherein the input device comprises one or more of a finger, a hand, an arm, and a stylus. [9] The method of claim 7 or 8, wherein determining the input error comprises modeling an effect of the detected vibration on the input device, wherein the input error comprises a difference between the input and a modeled input that would have occurred had the detected vibration not occurred. [10] The method of claim 9, wherein the one or more tracking devices are further configured to determine an object of a visual focus of the user, wherein the data indicative of the desired user interaction comprises, in addition, data indicative of the object of the user's visual focus, and wherein determining the input error further comprises determining a difference between the data indicative of the actual user interaction and the data indicative of the object of the user's visual focus. 
[11] The method of any one of claims 2 to 10, wherein determining, by the one or more processors, a corrected user interaction comprises determining a disturbance probability, wherein the disturbance probability comprises a probability that the input error was caused by the detected vibration, and wherein the corrected user interaction is at least partly a function of the disturbance probability. [12] The method of any one of claims 2 to 11, wherein the corrected user interaction comprises selecting a virtual element on the touch screen display device corresponding to the location of the data indicative of the actual user interaction adjusted by the input error. [13] The method of any one of claims 2 to 12, wherein the corrected user interaction comprises the data indicative of the actual user interaction when the data indicative of the actual user interaction comprises a selection of a virtual element that includes a safety-critical function. [14] The method of any one of the preceding claims, wherein the detected vibration comprises at least one of a turbulence, a gravitational force, or a mechanical vibration. 
[15] A system for correcting a vibration-induced input error on a touch screen display device on an aircraft, comprising: a touch screen display device configured to display a user interface, the user interface comprising one or more interactive virtual elements, each virtual element associated with a function to be performed by a component of the aircraft during operation of the aircraft; one or more vibration sensors located on the aircraft configured to obtain data indicative of a detected vibration; one or more tracking devices configured to obtain data indicative of a desired user interaction; and a control system configured to receive data indicative of actual user interaction with at least one virtual element, receive data indicative of a desired user interaction, receive data indicative of a detected vibration, determine a corrected user interaction corresponding to at least one virtual element selected based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of the detected vibration, and, in response to the corrected user interaction, send one or more control signals to one or more components of the aircraft to perform at least a portion of the task associated with the at least one selected virtual element. [16] The system of claim 15, wherein determining a corrected user interaction comprises determining an input error based at least in part on the data indicative of the desired user interaction. [17] The system of claim 16, wherein the one or more tracking devices are configured to determine an object of a visual focus of the user, wherein the data indicative of the desired user interaction comprises data indicative of the user's visual focus object, and wherein determining the input error comprises determining a difference between the data indicative of the actual user interaction and the data indicative of the object of the visual focus of the user. [18] The system of claim 16 or 17, wherein the data indicative of the actual user interaction comprises an input on the touch screen display device received by an input device, wherein the one or more tracking devices are configured to track a movement of the input device, wherein determining the input error comprises modeling an effect of the detected vibration on the input device, and wherein the input error comprises a difference between the input and a modeled input that would have occurred had the detected vibration not occurred. [19] An aircraft, comprising: an engine; a fuselage; one or more vibration sensors; one or more tracking devices; a touch screen display device; and a control system comprising one or more processors and one or more memory devices located on the aircraft, the one or more memory devices storing instructions which, when executed by the one or more processors, cause the one or more processors to perform operations comprising: providing for display a user interface on the touch screen display device, the user interface comprising one or more interactive virtual elements, each virtual element associated with a function to be performed by a component of the aircraft during operation of the aircraft; receiving data indicative of actual user interaction with the touch screen display device; receiving data indicative of a desired user interaction from the one or more tracking devices; receiving data indicative of a detected vibration from the one or more vibration sensors; determining a corrected user interaction corresponding to at least one selected virtual element based at least in part on the data indicative of the actual user interaction, the data indicative of the desired user interaction, and the data indicative of 
the detected vibration; and in response to the corrected user interaction, sending one or more control signals to one or more aircraft components to perform at least a portion of the function associated with the at least one selected virtual element. [20" id="c-fr-0020] The aircraft of claim 19, wherein determining a corrected user interaction further comprises determining an input error based at least in part on the data indicative of the desired user interaction.
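Claim 18 defines the input error as the difference between the actual input and the input modeled as if the detected vibration had not occurred. A simple way to model that effect is to twice-integrate accelerometer samples into an estimated hand displacement at touch time. The sketch below is an illustrative simplification under that assumption; the function names, the constant time step, and the double-integration model are not specified by the patent.

```python
from typing import List, Tuple

Vec = Tuple[float, float]

def vibration_displacement(accel_samples: List[Vec], dt: float) -> Vec:
    """Estimate the vibration-induced displacement of the input device by
    twice integrating accelerometer samples (a simplified model of claim 18)."""
    vx = vy = dx = dy = 0.0
    for ax, ay in accel_samples:
        vx += ax * dt  # integrate acceleration -> velocity
        vy += ay * dt
        dx += vx * dt  # integrate velocity -> displacement
        dy += vy * dt
    return (dx, dy)

def input_error(touch: Vec, accel_samples: List[Vec], dt: float) -> Vec:
    """Claim 18: input error = actual input minus the input modeled as if
    the detected vibration had not occurred."""
    dx, dy = vibration_displacement(accel_samples, dt)
    modeled = (touch[0] - dx, touch[1] - dy)  # touch point absent the vibration
    return (touch[0] - modeled[0], touch[1] - modeled[1])
```

Under this model the input error reduces to the estimated displacement itself; the gaze-based alternative of claim 17 would instead take the difference between the touch point and the object of the user's visual focus.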
Family patents:
Publication number | Publication date
CA2969959A1 | 2017-12-20
US10996793B2 | 2021-05-04
GB201610732D0 | 2016-08-03
CA2969959C | 2019-06-25
US20170364207A1 | 2017-12-21
BR102017012163A2 | 2018-10-30
FR3052889B1 | 2019-11-08
GB2551520A | 2017-12-27
GB2551520B | 2018-11-21
Legal status:
2018-06-26 | PLFP | Fee payment | Year of fee payment: 2
2019-02-15 | PLSC | Publication of the preliminary search report | Effective date: 20190215
2019-05-22 | PLFP | Fee payment | Year of fee payment: 3
2020-05-20 | PLFP | Fee payment | Year of fee payment: 4
2021-05-19 | PLFP | Fee payment | Year of fee payment: 5
Priority:
Application number | Filing date | Patent title
GB1610732.8 | 2016-06-20
GB1610732.8A | GB2551520B | 2016-06-20 | 2016-06-20 | Correction of vibration-induced error for touch screen display in an aircraft