Patent abstract:
The present invention relates to a mobile terminal (200) and a control method thereof, by which a watch-type mobile terminal (200) can be controlled remotely. The present invention includes a sensor (140), a display (151) and a controller (180) configured to identify a first gesture input based on an input received from the sensor, wherein the first gesture input is identified when the mobile terminal (200) is moved or rotated, and to cause the display to display detailed information about an event at the mobile terminal (200) in response to identifying the first gesture input.

Publication number: FR3022367A1
Application number: FR1554365
Filing date: 2015-05-13
Publication date: 2015-12-18
Inventors: Taeseong Kim; Jisun Lee; Soyeon Yim; Eugene Myung
Applicant: LG Electronics Inc.
IPC main class:
Patent description:
[0001] The present invention relates to a mobile terminal, and more particularly to a mobile terminal and a control method thereof.

BACKGROUND OF THE INVENTION

Although the present invention is suitable for a wide range of applications, it is particularly suitable for remotely controlling a watch-type mobile terminal. [0002] Terminals can generally be classified into mobile/portable terminals or fixed terminals based on their mobility. Mobile terminals can also be classified into handheld terminals or vehicle-mounted terminals depending on whether or not they can be carried directly by a user. Mobile terminals have become increasingly functional. Examples of these functions include voice and data communications, image and video capture by means of a camera, audio recording, music file playback through a speaker system, and the display of images and video on a display unit. Some mobile terminals include additional functionality that supports games, while other terminals are configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals that allow viewing of content such as videos and television programs. Efforts are being made to support and increase the functionality of mobile terminals. These efforts include software and hardware improvements as well as changes and improvements to the structural components. In order to further improve the mobility and portability of a mobile terminal, many current efforts focus on developing a mobile terminal worn on the wrist, such as a watch-type terminal. However, since a watch-type mobile terminal has a display unit of limited size, it is more difficult for a user to touch the screen of a watch-type mobile terminal than that of a conventional mobile terminal. If a new input method appropriate for manipulating a watch-type mobile terminal is developed, it can further facilitate the handling of the watch-type mobile terminal. [0003] Accordingly, embodiments of the present invention relate to a mobile terminal and a control method thereof that substantially eliminate one or more problems due to the limitations and disadvantages of the related art. [0004] An object of the present invention is to provide a mobile terminal and a control method thereof that enhance user convenience. In particular, an object of the present invention is to provide a mobile terminal and a control method thereof, whereby the mobile terminal can be controlled remotely based on a gesture input of the user. [0005] Another object of the present invention is to provide a mobile terminal and a control method thereof that make it easier for a user to control the mobile terminal remotely, in such a manner that a gesture input for effecting a content change or a mode change and a gesture input for performing a general control operation without the content change or the mode change are distinguished from each other according to a type of the gesture input. Other advantages, objects and features of the invention will be set forth in the disclosure herein as well as in the accompanying drawings. Such aspects may also be appreciated by those skilled in the art on the basis of the disclosure herein.
To achieve these objects and other advantages, and in accordance with the purpose of the invention as embodied and broadly described herein, a watch-type mobile terminal according to the present invention may include a sensor, a display and a controller configured to identify a first gesture input based on an input received from the sensor, wherein the first gesture input is identified when the mobile terminal is moved or rotated, and to cause the display to display detailed information about an event at the mobile terminal in response to identifying the first gesture input. In another aspect of the present invention, as embodied and broadly described herein, a method of controlling a watch-type mobile terminal according to the present invention may include identifying a first gesture input based on an input received from a sensor, wherein the first gesture input is identified when the mobile terminal is moved or rotated, and displaying information about an event in response to identifying the first gesture input. The effects obtainable from the present invention are not limited to the aforementioned effects. Other effects not mentioned can also be clearly understood from the following description by one skilled in the art to which the present invention pertains. It should be understood that both the foregoing general description and the following detailed description are provided by way of example and explanation and are intended to provide further explanation of the invention as claimed. The present invention will be better understood from the following detailed description and accompanying drawings, which are provided by way of illustration only, and which are therefore not limitative of the present invention, and in which: FIG. 1A is a block diagram of a mobile terminal according to the present disclosure; FIGS. 1B and 1C are conceptual representations of an example of the mobile terminal, viewed from different directions; FIG. 2 is a perspective diagram of an example of a watch-type mobile terminal according to another exemplary embodiment; FIGS. 3A to 3C are diagrams of examples of different gesture inputs; FIG. 4 is a flowchart of an operation of a mobile terminal according to the present invention; FIG. 5 is a diagram of an example of canceling a power saving mode of a mobile terminal if a gesture of the first type is input; FIG. 6 is a diagram of an example of displaying detailed information relating to an event if a gesture of the first type is input; FIG. 7 is a diagram of an example of outputting an indicator indicating an event occurrence if a gesture of the first type is input; FIG. 8 is a diagram of an operation of a mobile terminal if a gesture of the first type is input while detailed information relating to an event is displayed; FIGS. 9A and 9B are diagrams of an example of an operation of a mobile terminal if a gesture of the second type is input; FIG. 10 is a diagram of an example of an operation of a mobile terminal if a gesture of the second type is input; FIG. 11 is a diagram of an example of an operation of a mobile terminal if a gesture of the second type is input; FIG. 12 is a diagram of an example of execution of a defined application; FIGS. 13A and 13B are diagrams of an example for describing an operation of a mobile terminal if a gesture of the second type is input during the execution of a music application;
FIG. 14 is a diagram of an example for describing an operation of a mobile terminal if a gesture of the second type is input during the execution of a music application; FIG. 15 is a diagram of an example of an operation of a mobile terminal if a gesture of the second type is input; FIG. 16 is a diagram of an example of changing the playback time of a multimedia file; FIGS. 17A and 17B are diagrams of an example of displaying detailed information relating to an event if a gesture of the second type is input; FIG. 18 is a diagram of an example of changing an operating mode of a mobile terminal if a gesture of the second type is input; FIGS. 19A-19C are diagrams of examples of operations of a mobile terminal if a gesture input is received upon receipt of an incoming call; FIGS. 20A and 20B are diagrams of an operation of a mobile terminal if a gesture of the second type is input during the output of an alarm; FIG. 21 is a diagram of an example of setting a vibration mode and an audio output mode by a gesture input; FIG. 22 is a diagram of an example of setting a brightness of a display unit; FIG. 23 is a diagram of an example of unlocking a display unit; FIG. 24 is a diagram of an example of switching a mobile terminal to a power saving mode; FIG. 25 is a diagram of an example of setting a feedback output according to a type of an event; and FIGS. 26A and 26B are diagrams of an example of setting an application to be executed in response to an input count of a gesture of the second type. A detailed description will now be given according to embodiments disclosed herein, with reference to the accompanying drawings. For the purpose of a brief description with reference to the drawings, the same or equivalent components may be given the same or similar reference numbers, and their description will not be repeated. In general, a suffix such as "module" or "unit" may be used to designate elements or components. The use of such a suffix herein is intended purely to facilitate the description of the specification, and the suffix itself is not meant to convey any special meaning or function. In the present disclosure, what is well known to those skilled in the art has generally been omitted for the sake of brevity. The accompanying drawings are used to facilitate the understanding of various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed as extending to any modifications, equivalents and substitutes in addition to those specifically set out in the accompanying drawings. It will be understood that although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. In general, these terms are only used to distinguish one element from another. It will be understood that when an element is referred to as being "connected to" another element, the element can be connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected to" another element, no intervening elements are present. A singular representation may include a plural representation unless it has a clearly different meaning in context.
Terms such as "include" or "have" are used here and it must be understood that they are meant to indicate the existence of more than one component, function or step, disclosed in the Brief, and it is also understood that more or less components, functions or steps can also be used. [0006] The mobile terminals presented here can be implemented using a variety of different types of terminals. Examples of these devices include cell phones, smart phones, user devices, laptops, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs - Portable Multimedia Players), browsers, personal computers (PCs), electronic slates, digital tablets, ultra-books, portable devices (eg, smart watches, smart glasses, head-mounted displays), and the like . By way of nonlimiting example only, a more complete description will be made with reference to particular types of mobile terminals. However, such teachings also apply to other types of terminals, such as the aforementioned types. Moreover, these teachings can also be applied to fixed terminals such as digital TV, desktop computers and the like. We will now refer to Figs. 1A-1C, where Fig. 1A is a block diagram of a mobile terminal according to the present disclosure, and FIGS. 1B and 1C of the conceptual representations of an example of the mobile terminal, viewed from different directions. The mobile terminal 100 is shown with components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power supply unit 190. It should be understood that the use of all the illustrated components is not a requirement, and that it is also possible to use more or fewer components. [0007] If we now look at Figure 1A, the mobile terminal 100 is shown as having a wireless communication unit 110 configured with several components commonly implemented. For example, the wireless communication unit 110 typically includes one or more components that enable wireless communication between the mobile terminal 100 and a wireless communication system or network, within which the mobile terminal is located. The wireless communication unit 110 generally includes one or more modules that make it possible to establish communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 generally includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of the following: a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-term communication module scope 114 and a location information module 115. The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting a signal audio, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a programmable key and the like) to allow a user to enter information. Data (eg, audio, video, images, and the like) is obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user controls, and combinations thereof. 
The detection unit 140 is generally implemented by means of one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in FIG. 1A, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If necessary, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, the camera 121), the microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a health sensor, a biometric sensor, and the like), to name only a few. The mobile terminal 100 may be configured to use the information obtained by the detection unit 140, and in particular the information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 is generally configured to output various types of information, such as sound, video, touch output, and the like. The output unit 150 is shown with at least one of a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an interlayer structure or an integrated structure with a touch sensor in order to implement a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as function as the user input unit 123, which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with the various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, when the external device is connected to the interface unit 160. The memory 170 is generally implemented to store data supporting various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions relating to the operation of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server by wireless transmission. Other application programs may be installed in the mobile terminal 100 at the time of manufacture or shipment, which is generally the case for the basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for an application program to be stored in the memory 170, installed in the mobile terminal 100 and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
The controller 180 generally functions to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing the signals, data, information and the like that are input or output by the various components illustrated in FIG. 1A, or by activating application programs stored in the memory 170. For example, the controller 180 controls some or all of the components illustrated in FIGS. 1A-1C according to the execution of an application program that has been stored in the memory 170. The power supply unit 190 may be configured to receive external power or provide internal power in order to supply the amount of power required to operate the elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be integrated into the terminal body, or configured to be detachable from the terminal body. Referring again to FIG. 1A, various components illustrated in this figure will now be described in more detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is generally configured to receive a broadcast signal and/or broadcast-related information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to support switching between broadcast channels. The broadcast management entity may be implemented by means of a server or system that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a pre-generated broadcast signal and/or broadcast-related information and sends these items to the mobile terminal. The broadcast signal may be implemented by means of a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and combinations thereof, among others. The broadcast signal in some cases may further include a data broadcast signal combined with a TV or radio broadcast signal. The broadcast signal may be encoded according to any one of a variety of technical standards or broadcasting methods (for example, those of the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), Digital Video Broadcasting (DVB), the Advanced Television Systems Committee (ATSC), and the like) for the transmission and reception of digital broadcast signals. The broadcast receiving module 111 may receive the digital broadcast signals by a method appropriate to the transmission method used. Examples of broadcast-related information may include information associated with a broadcast channel, a broadcast program, a broadcast event, a broadcast service provider, or the like. The broadcast-related information may also be provided via a mobile communication network, and in this case received by the mobile communication module 112. The broadcast-related information may be implemented in various formats. For example, the broadcast-related information may include an Electronic Program Guide (EPG) for Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) for Digital Video Broadcast - Handheld (DVB-H), and the like.
[0008] The broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in a suitable device, such as the memory 170. [0009] The mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. These network entities form part of a mobile communication network, which is built according to technical standards or communication methods for mobile communications (for example, Global System for Mobile Communication (GSM), Code Division Multiple Access (CDMA), CDMA2000 (Code Division Multiple Access 2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like). Wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various data formats intended to allow the communication of text and multimedia messages. The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals over communication networks using wireless Internet technologies. Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like. The wireless Internet module 113 may transmit and/or receive data according to one or more of these wireless Internet technologies, as well as other Internet technologies. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like, as part of a mobile communication network, the wireless Internet module 113 provides such wireless Internet access. As such, the Internet module 113 may cooperate with, or serve as, the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Technologies suitable for implementing these short-range communications include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless Universal Serial Bus (Wireless USB), and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, through wireless local area networks. An example of a wireless local area network is a wireless personal area network.
In some embodiments, another mobile terminal (which may be configured in the same manner as the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or to cooperate in another way with the mobile terminal 100). The short-range communication module 114 can detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that has been authenticated to communicate with the mobile terminal 100, the controller 180, for example, can transmit data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114. As a result, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. In addition, when a message is received in the mobile terminal 100, the user can view the received message using the wearable device.
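The event-forwarding behavior just described can be pictured with a short sketch. The following Kotlin fragment is a minimal illustration under stated assumptions, not the patent's implementation: the `ShortRangeLink` interface, the event types and the wire format are hypothetical stand-ins for whatever transport the short-range communication module 114 actually exposes.

```kotlin
// Hypothetical event types a handset might forward to an authenticated wearable.
sealed class TerminalEvent {
    data class IncomingCall(val callerId: String) : TerminalEvent()
    data class MessageReceived(val sender: String, val body: String) : TerminalEvent()
}

// Stand-in for the transport offered by a short-range communication module.
interface ShortRangeLink {
    fun isAuthenticated(deviceId: String): Boolean
    fun send(deviceId: String, payload: ByteArray)
}

class EventForwarder(private val link: ShortRangeLink) {
    // Forward an event only to a wearable that has been authenticated,
    // mirroring the check described for the short-range communication module 114.
    fun forward(deviceId: String, event: TerminalEvent) {
        if (!link.isAuthenticated(deviceId)) return
        val payload = when (event) {
            is TerminalEvent.IncomingCall -> "CALL:${event.callerId}"
            is TerminalEvent.MessageReceived -> "MSG:${event.sender}:${event.body}"
        }.toByteArray()
        link.send(deviceId, payload)
    }
}
```

The wearable side would decode the payload and surface the call or message notification; the authentication check is what keeps the handset from broadcasting events to arbitrary nearby devices.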
The location information module 115 is generally configured to detect, calculate, obtain or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a Global Positioning System (GPS) module, a Wi-Fi module, or both. If necessary, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. [0010] For example, when the mobile terminal uses a GPS module, the position of the mobile terminal can be acquired through a signal sent by a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, the position of the mobile terminal can be acquired from information relating to a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to allow various types of inputs to the mobile terminal 100. Examples of such inputs include audio, image, video, data and user inputs. Image and video inputs are often obtained by means of one or more cameras 121. These cameras 121 can process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to allow the input of a plurality of images having various angles or focal lengths into the mobile terminal 100. As a further example, the cameras 121 may be arranged in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image. [0011] The microphone 122 is generally implemented to allow audio input into the mobile terminal 100. The audio input may be processed in various ways according to a function being performed in the mobile terminal 100. If necessary, the microphone 122 may include a set of noise suppression algorithms to eliminate unwanted noise produced during the reception of external audio. The user input unit 123 is a component that allows input by a user. Such user input may allow the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or on a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As an example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen by software processing, or a touch key that is located on the mobile terminal at a location other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphics, text, icons, video, or a combination thereof. [0012] The detection unit 140 is generally configured to detect one or more of the following: internal information of the mobile terminal, information about the surrounding environment of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal, based on the detection provided by the detection unit 140. The detection unit 140 may be implemented by means of any of a variety of sensors, some of which will now be described in more detail. [0013] The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like, without mechanical contact. The proximity sensor 141 may be arranged in an internal region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141 may include, for example, any one of the following sensors: a transmissive-type photoelectric sensor, a direct reflective-type photoelectric sensor, a mirror reflective-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive-type proximity sensor, a magnetic-type proximity sensor, an infrared proximity sensor, and the like. When the touch screen used is of the capacitive type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen by the variations of an electromagnetic field, which responds to the approach of a conductive object. In this case, the touch screen (touch sensor) may also be classified in the category of a proximity sensor. The term "proximity touch" will frequently be used herein to refer to a scenario in which a pointer is placed near the touch screen without touching it. The term "contact touch" will frequently be used herein to refer to a scenario in which a pointer physically contacts the touch screen. As for the position corresponding to a proximity touch of the pointer relative to the touch screen, this position corresponds to the position at which the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, movement state, and the like). In general, the controller 180 processes data corresponding to the proximity touches and proximity touch patterns detected by the proximity sensor 141, and displays visual information on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to perform different operations or to process different data depending on whether the touch relative to a point of the touch screen is a proximity touch or a contact touch.
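A rough sketch of this proximity/contact branching is given below. It is an assumption-laden illustration: the `TouchEvent` model and the two handler functions are hypothetical, and the specific reactions (preview versus select) are only one plausible pairing of operations.

```kotlin
// Hypothetical touch event model: a capacitive panel reports either a
// proximity touch (pointer hovering near the screen) or a contact touch.
sealed class TouchEvent {
    abstract val x: Int
    abstract val y: Int
    data class Proximity(override val x: Int, override val y: Int,
                         val distanceMm: Float) : TouchEvent()
    data class Contact(override val x: Int, override val y: Int) : TouchEvent()
}

// The controller may branch to different operations depending on whether
// the touch at a given point is a proximity touch or a contact touch.
fun handleTouch(event: TouchEvent) {
    when (event) {
        is TouchEvent.Proximity ->
            previewItemAt(event.x, event.y)   // e.g. highlight without selecting
        is TouchEvent.Contact ->
            selectItemAt(event.x, event.y)    // e.g. commit the selection
    }
}

fun previewItemAt(x: Int, y: Int) { /* placeholder UI reaction */ }
fun selectItemAt(x: Int, y: Int) { /* placeholder UI reaction */ }
```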
A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, by any of a variety of touch methods. Examples of such touch methods include the resistive type, the capacitive type, the infrared type, and the magnetic field type, among others. [0014] As an example, the touch sensor may be configured to convert variations of a pressure applied to a specific portion of the display unit 151, or to convert a capacitance appearing in a specific portion of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a stylus, a pointer, or the like. When a touch input is detected by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit the corresponding data to the controller 180. Thus, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component independent of the controller 180, the controller 180 itself, or a combination thereof. In some embodiments, the controller 180 may execute the same or different commands depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or a different command depending on the object that provides the touch input may be decided on the basis of the current operating state of the mobile terminal 100 or an application program currently being executed, for example. The touch sensor and the proximity sensor may be used individually, or in combination, to detect various types of touches. These touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If necessary, an ultrasonic sensor may be implemented to recognize position information relating to a touch object by means of ultrasonic waves. The controller 180 may, for example, calculate the position of a wave generation source from information detected by an optical sensor and a plurality of ultrasonic sensors. Because light is much faster than ultrasonic waves, the time taken by the light to reach the optical sensor is much shorter than the time taken by the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated on the basis of this fact. For example, the position of the wave generation source can be calculated using the time difference relative to the moment at which the ultrasonic wave reaches the sensor, with the light serving as a reference signal.
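The time-difference calculation just described can be made concrete. The sketch below is a simplified planar illustration, not the patent's algorithm: light arrival is treated as time zero (since it is effectively instantaneous over these distances), so each ultrasonic delay yields a distance, and two sensor distances narrow the source position to at most two candidate points by circle intersection. The sensor layout and variable names are assumptions made for the example.

```kotlin
// Approximate speed of sound in air at room temperature (m/s); the light
// reference is treated as arriving instantaneously because it is so much
// faster than the ultrasonic wave.
const val SPEED_OF_SOUND = 343.0

// Distance from one ultrasonic sensor to the wave source, given the delay
// (in seconds) between the light reference and the ultrasonic arrival.
fun distanceToSource(ultrasoundDelaySec: Double): Double =
    SPEED_OF_SOUND * ultrasoundDelaySec

// Each distance constrains the source to a circle around its sensor; two
// sensors at known positions therefore leave at most two candidate points.
fun candidatePositions(
    s1: Pair<Double, Double>, d1: Double,
    s2: Pair<Double, Double>, d2: Double
): List<Pair<Double, Double>> {
    val (x1, y1) = s1; val (x2, y2) = s2
    val dx = x2 - x1; val dy = y2 - y1
    val d = kotlin.math.hypot(dx, dy)
    // No intersection if the circles are too far apart or nested.
    if (d > d1 + d2 || d < kotlin.math.abs(d1 - d2)) return emptyList()
    val a = (d1 * d1 - d2 * d2 + d * d) / (2 * d)
    val h = kotlin.math.sqrt(d1 * d1 - a * a)
    val mx = x1 + a * dx / d; val my = y1 + a * dy / d
    return listOf(
        Pair(mx + h * dy / d, my - h * dx / d),
        Pair(mx - h * dy / d, my + h * dx / d)
    )
}
```

With a third sensor, the ambiguity between the two candidates would be resolved, which is why the text speaks of a plurality of ultrasonic sensors.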
The camera 121 generally includes at least one camera sensor (CCD, CMOS, etc.), a photo sensor (or image sensors), and a laser sensor. Implementing the camera 121 with a laser sensor can enable the detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan the movement of a physical object near the touch screen. In more detail, the photo sensor may include photodiodes and transistors in rows and columns to scan the content received by the photo sensor by means of an electrical signal that varies according to the amount of light applied. In other words, the photo sensor can calculate the coordinates of the physical object according to the variation of light, and thereby obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program currently executed in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display system such as a stereoscopic system (glasses system), an auto-stereoscopic system (glasses-free system), a projection system (holographic system), or the like. [0015] In general, a 3D stereoscopic image may include a left image (for example, an image for the left eye) and a right image (for example, an image for the right eye). Depending on how the left and right images are combined into a 3D stereoscopic image, a 3D stereoscopic imaging method can be divided into a top-down method in which the left and right images are placed above and below in a frame, a left-to-right (or side-by-side) method in which the left and right images are placed left and right in a frame, a checkerboard method in which fragments of the left and right images are arranged in tile form, an interlaced method in which the left and right images are alternately arranged in columns or rows, and a time-sequential (or frame-by-frame) method in which the left and right images are displayed alternately on a time basis. In addition, as for a 3D thumbnail, a left image thumbnail and a right image thumbnail can be generated from a left image and a right image of an original image frame, respectively, and then combined to generate a single 3D thumbnail. In general, the term "thumbnail" may be used to designate a reduced image or a reduced still image. A generated left image thumbnail and right image thumbnail may be displayed with a horizontal distance difference between them corresponding to the disparity between the left image and the right image on the screen, thereby providing a stereoscopic sense of space. A left image and a right image required to implement a 3D stereoscopic image may be displayed on the stereoscopic display unit by means of a stereoscopic processing unit. The stereoscopic processing unit may receive a 3D image and extract the left image and the right image, or may receive a 2D image and convert it into a left image and a right image.
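As an illustration of one of these combination schemes, the sketch below packs a left-eye and a right-eye image into a single side-by-side frame. The flat row-major `IntArray` pixel-buffer representation is an assumption made for the example; the other schemes (top-down, checkerboard, interlaced) differ only in how the copy loop interleaves the two sources.

```kotlin
// A minimal image type: row-major pixel buffer with known dimensions.
data class Frame(val width: Int, val height: Int, val pixels: IntArray)

// Left-to-right (side-by-side) combination: the left-eye and right-eye
// images are placed next to each other in one output frame.
fun combineSideBySide(left: Frame, right: Frame): Frame {
    require(left.width == right.width && left.height == right.height) {
        "left and right images must have the same dimensions"
    }
    val out = IntArray(left.width * 2 * left.height)
    for (row in 0 until left.height) {
        // Copy one row of the left image, then the matching row of the right image.
        left.pixels.copyInto(out, row * left.width * 2,
                             row * left.width, (row + 1) * left.width)
        right.pixels.copyInto(out, row * left.width * 2 + left.width,
                              row * left.width, (row + 1) * left.width)
    }
    return Frame(left.width * 2, left.height, out)
}
```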
The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 can provide an audible output relating to a particular function (for example, a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented in the form of a receiver, a speaker, a buzzer, or the like. A haptic module 153 may be configured to produce various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect produced by the haptic module 153 is vibration. [0016] The intensity, pattern, and the like of a vibration produced by the haptic module 153 may be controlled by user selection or by settings of the controller. For example, the haptic module 153 may output different vibrations in a combined manner or in a sequential manner. In addition to vibration, the haptic module 153 can produce various other tactile effects, such as a stimulation effect produced by an arrangement of pins moving vertically to touch the skin, a spray force or a suction force of air through a jet orifice or a suction opening, a touch on the skin, the contact of an electrode, an electrostatic force, an effect reproducing the sensation of heat or cold using an element capable of absorbing or producing heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation, for example in the fingers or arm of the user, as well as to transmit the tactile effect through direct contact. Two or more haptic modules 153 may be provided depending on the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate the occurrence of an event by light from a light source. Examples of events occurring in the mobile terminal 100 include receiving a message, receiving a call signal, a missed call, an alarm, a schedule notice, receiving an email, receiving information through an application, and the like. A signal output by the optical output module 154 may be implemented in such a way that the mobile terminal emits monochromatic light or light having a plurality of colors. The output signal may end when the mobile terminal detects that a user has checked the occurred event, for example. The interface unit 160 serves as an interface for connecting external devices to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted by an external device, receive power to be supplied to elements and components of the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identification device") may take the form of a smart card. Consequently, the identification device can be connected to the terminal 100 via the interface unit 160.
When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or may serve as a passage to allow various command signals entered by the user from the cradle to be transmitted to the mobile terminal therethrough. Various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle. [0017] The memory 170 may store programs to support the operations of the controller 180 and store input and/or output data (for example, a phone book, messages, still images, videos, etc.). The memory 170 may store data relating to the various patterns of vibration and sound that are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media, including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (for example, SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in connection with a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 can generally control the general operations of the mobile terminal 100. For example, the controller 180 can set or release a lock state to prevent a user from entering a control command relating to applications when the state of the mobile terminal satisfies a preset condition. The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of these components in order to implement the various exemplary embodiments disclosed herein. The power supply unit 190 receives external power or provides internal power and supplies the amount of power required for operating the respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is generally rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as an example of the interface unit 160 to which an external charger is electrically connected to supply power for recharging the battery. As another example, the power supply unit 190 may be configured to recharge the battery wirelessly without using the connection port. [0018] In this example, the power supply unit 190 can receive power, transmitted by an external wireless power transmitter, using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance. Various embodiments described herein may be implemented on a computer-readable medium, a machine-readable medium, or a similar medium by means of, for example, software, hardware, or any combination thereof. Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body.
However, the mobile terminal 100 may also be implemented in any of a number of different configurations. Examples of such configurations include the watch type, the clip type, the glasses type, or a folder type, a flip type, a slide type, a swing type, and a swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The discussion herein will often relate to a particular type of mobile terminal (for example, the bar type, watch type, glasses type, and the like). However, such teachings relating to a particular type of mobile terminal generally apply to other types of mobile terminals as well. The mobile terminal 100 generally includes a housing (for example, a chassis, housing, cover, and the like) forming the external appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102. [0019] The display unit 151 is shown located on the front side of the terminal body for outputting information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. [0020] In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear housing 102. Therefore, when the rear cover 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are exposed to the outside. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, upon coupling, the rear housing 102 may also be completely concealed by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for exposing a camera 121b or an audio output module 152b to the outside. [0021] The housings 101, 102, 103 may be formed by injection-molding a synthetic resin or may be made of a metal, for example, stainless steel, aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which the plurality of housings form an internal space for accommodating components, the mobile terminal 100 may be configured such that a single housing forms the internal space. In this example, a mobile terminal 100 having a uni-body is formed in such a manner that a synthetic resin or metal extends from a side surface to a rear surface. If necessary, the mobile terminal 100 may include a waterproofing unit (not shown) to prevent water from entering the terminal body. For example, the waterproofing unit may include a waterproofing member that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, to seal an internal space when these housings are coupled. [0022] FIGS. 1B and 1C illustrate certain components as being arranged on the mobile terminal. However, it should be understood that alternative arrangements are possible and remain within the teachings of the present disclosure. Some components may be omitted or rearranged.
For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on a side surface of the terminal body. The display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented by means of one or more suitable display devices. Examples of such suitable display devices include a Liquid Crystal Display (LCD), a Thin Film Transistor Liquid Crystal Display (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display, and combinations thereof. The display unit 151 may be implemented by means of two display devices, which may implement identical or different display technologies. For example, a plurality of display units 151 may be arranged on one side, either spaced apart from each other or integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor that detects a touch input received on the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to detect that touch, and the controller 180 may, for example, generate a control command or other signal corresponding to the touch. Content entered in a tactile manner may be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a metal wire whose pattern is printed directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see FIG. 1A). Therefore, the touch screen may replace at least some of the functions of the first manipulation unit 123a. The first audio output module 152a may be implemented in the form of a speaker for outputting voice audio, alarm sounds, multimedia audio reproduction, and the like. The window 151a of the display unit 151 generally includes an opening for allowing audio produced by the first audio output module 152a to pass through. One alternative is to allow the audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole formed independently for outputting audio sounds may not be seen or may otherwise be hidden in appearance, thereby further simplifying the appearance and fabrication of the mobile terminal 100. The optical output module 154 may be configured to emit light indicating the occurrence of an event. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notice, email reception, reception of information through an application, and the like. When a user has checked an occurred event, the controller may control the optical output unit 154 to stop the light emission. The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode.
The processed image frames can then be displayed on the display unit 151 or stored in the memory 170. [0023] The first and second manipulation units 123a and 123b are examples of the user input unit 123, which can be manipulated by a user to provide an input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform a manipulation such as touching, pressing, scrolling, or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform a manipulation such as a proximity touch, a hovering touch, or the like. [0024] FIG. 1B shows the first manipulation unit 123a in the form of a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. An input received by the first and second manipulation units 123a and 123b may be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide a menu input, a home key, a cancellation, a search, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control the sound volume output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit can be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, to control the sound volume output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to allow a touch input, a push input, or combinations thereof. The rear input unit may be located so as to overlap the display unit 151 of the front side in the thickness direction of the terminal body. As an example, the rear input unit may be located on an upper end portion of the rear side of the terminal body so that a user can easily manipulate it with the index finger when holding the terminal body with one hand. Alternatively, the rear input unit can be positioned almost anywhere on the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first manipulation unit 123a in the rear input unit. Thus, in cases where the first manipulation unit 123a is omitted from the front side, the display unit 151 can have a larger screen. [0025] As another possibility, the mobile terminal 100 may include a finger scan sensor that scans a user's fingerprint. The controller 180 can then use the fingerprint information detected by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If necessary, several microphones can be implemented, with an arrangement that allows the reception of stereo sounds. The interface unit 160 may serve as a path for the mobile terminal 100 to communicate with external devices.
For example, the interface unit 160 may include one or more of the following: a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for near-field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for receiving an external card, for example a Subscriber Identification Module (SIM), a User Identity Module (UIM), or a memory card for storing information. The second camera 121b is shown located on the rear side of the terminal body, with an image capture direction substantially opposite to the image capture direction of the first camera unit 121a. If necessary, the second camera 121b may alternatively be located at other locations, or made movable so as to have an image capture direction different from that shown. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such a camera may be called an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various ways by means of the plurality of lenses, producing better quality images. As shown in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in FIG. 1B, the second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions together with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna that configures a part of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed by means of a film attached to an inner surface of the rear cover 103, or by a housing that includes a conductive material. A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body. [0026] The battery 191 can receive power via a power source cable connected to the interface unit 160. In addition, the battery 191 can be recharged wirelessly by means of a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear housing 102 to shield the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impact or foreign matter. When the battery 191 is detachable from the terminal body, the rear cover 103 can be detachably coupled to the rear housing 102. An accessory for protecting the appearance of the mobile terminal 100 or facilitating or extending its functions may also be provided on the mobile terminal 100. As one example of an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided.
The cover or pouch can cooperate with the display unit 151 to extend the operation of the mobile terminal 100. Another example of an accessory is a stylus for assisting or extending a touch input to a touch screen. Fig. 2 is a perspective view illustrating an example of a watch-type mobile terminal 200 according to another exemplary embodiment. As shown in Fig. 2, the watch-type mobile terminal 200 includes a main body 201 with a display unit 351 and a band 202 connected to the main body 201 so as to be wearable on a wrist. In general, the mobile terminal 200 may be configured to include features that are the same as or similar to those of the mobile terminal 100 of Figs. 1A-1C. The main body 201 may include a housing having a certain appearance. As illustrated, the housing may include a first housing 201a and a second housing 201b cooperatively defining an internal space for accommodating various electronic components. Other configurations are possible. For example, a single housing may alternatively be implemented, configured to define the internal space, thereby implementing a mobile terminal 200 with a single body. The watch-type mobile terminal 200 can provide wireless communication, and an antenna for wireless communication can be installed in the main body 201. The antenna can extend its function using the housing. For example, a housing including a conductive material may be electrically connected to the antenna to extend a ground area or a radiation area. The display unit 351 is shown as being located on the front side of the main body 201, so that the displayed information is visible to a user. In some embodiments, the display unit 351 includes a touch sensor, so that the display unit can function as a touch screen. As illustrated, a window 351a is positioned on the first housing 201a to form a front surface of the terminal body in conjunction with the first housing 201a. The illustrated embodiment includes an audio output module 352, a camera 321, a microphone 322 and a user input unit 323 positioned on the main body 201. When the display unit 351 is implemented as a touch screen, additional function keys may be minimized or eliminated. For example, when the touch screen is implemented, the user input unit 323 may be omitted. The band 202 is commonly worn on the wrist of the user and may be made of a flexible material to facilitate the wearing of the device. For example, the band 202 may be made of leather, rubber, silicone, synthetic resin or the like. The band 202 may also be configured to be detachable from the main body 201. Accordingly, the band 202 may be replaceable by different types of bands according to a user's preference. In one configuration, the band 202 may be used to extend the performance of the antenna. For example, the band may include a ground extension portion (not shown) electrically connected to the antenna for extending a ground area. The band 202 may include a fastener 202a. The fastener 202a may be implemented as a buckle type, a snap-fit hook structure, a Velcro type or the like, and may include a flexible section or material. The drawing illustrates an example where the fastener 202a is implemented by means of a buckle. For the sake of clarity and ease of the following description, consider that the mobile terminal 100 according to the present invention is, for example, the watch-type mobile terminal shown in Fig. 2.
Consider further that the mobile terminal 100 according to the present invention includes at least one of the components shown in Figs. 1A to 1C. In particular, the mobile terminal 100 according to the present invention can be considered as including the wireless communication unit 110, the detection unit 140, the display unit 151 and the controller 180. The mobile terminal 100 according to the present invention can receive a gesture input from a user based on a detection signal of the detection unit 140. Thus, the controller 180 is able to determine what type of gesture input is received based on at least one of factors including a moving distance of the mobile terminal 100, a rotation angle of the mobile terminal 100 and a moving speed of the mobile terminal 100. Figures 3A-3C are diagrams of examples of different gesture inputs. As in the example shown in Fig. 3A (a), the mobile terminal 100 can receive a gesture input made in such a way that an arm wearing a watch-type mobile terminal is stretched toward the ground and then moved by being folded toward a user's body. As in the example shown in Figure 3A (b), the mobile terminal 100 can receive a gesture input made in the manner of stretching out an arm folded toward a user's body. As in the example shown in Figure 3B (a), the mobile terminal 100 can receive a gesture input made in the manner of a twist of a wrist on which a watch-type mobile terminal is worn. As in the example shown in Figure 3B (b), the mobile terminal 100 can receive a gesture input made in the manner of shaking an arm on which a watch-type mobile terminal is worn. As in the example shown in Fig. 3B (c) or Fig. 3B (d), the mobile terminal 100 may receive a gesture input of moving the mobile terminal 100 a short distance toward a user's body or away from it. The mobile terminal 100 according to the present invention can also receive a small gesture input by detecting a muscular movement of the wrist portion on which the mobile terminal 100 is worn. To this end, the mobile terminal may include a magnetic sensor configured to detect a magnetic variation according to a movement of a muscle, a pressure sensor configured to detect a pressure applied by a muscle movement, an ultrasonic wave sensor configured to detect a muscle movement by emitting ultrasonic waves and then detecting the reflected ultrasonic waves, and the like. Each finger uses a different muscle, the overall muscles used to move a single finger are different from the overall muscles used to move multiple fingers, and the muscles change in different ways depending on how the finger or fingers are moved. Therefore, based on a detection signal of the detection unit 140 configured to detect a change in a muscular movement, the controller 180 is able to determine how a specific finger of a user has moved. By way of illustration, as in the example shown in FIG. 3C (a), the mobile terminal 100 can receive a gesture input made in the manner of spreading two fingers apart (for example, a thumb and an index finger, etc.). In addition, as in the example shown in FIG. 3C (b), the mobile terminal 100 can receive a gesture input made in the manner of bringing two fingers together (for example, a thumb and an index finger, etc.). [0027] Thus, in order to distinguish a movement of a user's arm in everyday working and living environments from a gesture input, the controller 180 may further consider a moving speed of the mobile terminal 100, as illustrated in the sketch below.
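A minimal Kotlin sketch of this threshold-based distinction follows; the class names, threshold values, and the idea of summarizing the sensor signal as a distance, angle and speed triple are illustrative assumptions, not details taken from this description:

```kotlin
// Hypothetical motion summary derived from the terminal's sensors.
data class MotionSample(
    val distanceCm: Float,   // how far the terminal moved
    val angleDeg: Float,     // how far it rotated
    val speedCmPerS: Float   // how fast it moved
)

enum class GestureType { NONE, FIRST_TYPE, SECOND_TYPE }

// Assumed thresholds; the description only speaks of "defined" values.
const val MIN_GESTURE_SPEED = 20f    // below this: everyday arm movement
const val FIRST_TYPE_DISTANCE = 30f  // distance threshold for a first-type gesture
const val FIRST_TYPE_ANGLE = 60f     // rotation threshold for a first-type gesture

fun classify(sample: MotionSample): GestureType = when {
    // Too slow to be deliberate: not a gesture input at all.
    sample.speedCmPerS < MIN_GESTURE_SPEED -> GestureType.NONE
    // Large movement or rotation: first-type gesture.
    sample.distanceCm >= FIRST_TYPE_DISTANCE ||
        sample.angleDeg >= FIRST_TYPE_ANGLE -> GestureType.FIRST_TYPE
    // Small but deliberate movement (including finger motion): second-type.
    else -> GestureType.SECOND_TYPE
}

fun main() {
    println(classify(MotionSample(40f, 10f, 35f))) // FIRST_TYPE
    println(classify(MotionSample(5f, 15f, 25f)))  // SECOND_TYPE
    println(classify(MotionSample(5f, 5f, 4f)))    // NONE
}
```

The prose that follows spells out the same rule: a slow movement is ignored, a large movement or rotation counts as a first-type gesture, and a small but deliberate movement counts as a second-type gesture.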
In particular, when the mobile terminal 100 is moved or rotated, if the moving speed of the mobile terminal 100 is greater than or equal to a defined speed, the movement can be determined to be a gesture input. On the other hand, if the moving speed of the mobile terminal 100 is lower than the defined speed, it can be determined as not being a gesture input. The different types of gesture inputs shown in FIGS. 3A-3C can be classified into a first-type gesture and a second-type gesture based on at least one of a moving distance of the mobile terminal 100, a rotation angle of the mobile terminal 100 and a moving speed of the mobile terminal 100. As an illustration, as in the example shown in Figure 3A, the first-type gesture may include an action of moving the watch-type mobile terminal 100 worn on a wrist over a distance greater than or equal to a defined distance, or an action of rotating the watch-type mobile terminal 100 at an angle greater than or equal to a defined angle while it is worn on a wrist. On the other hand, as in the example shown in FIG. 3B, the second-type gesture may include an action of moving the watch-type mobile terminal 100 worn on a wrist over a distance less than the defined distance, or an action of rotating it at an angle less than the defined angle. As in the example shown in Figure 3C, the second-type gesture may also include an action of moving one or more fingers of a hand on which the mobile terminal 100 is worn. If the first-type gesture is entered, the mobile terminal 100 according to the present invention can control a change of the displayed content or of the operating mode. For example, if the first-type gesture is entered, the controller 180 may change the displayed content by commanding the display unit 151 to switch to a screen for checking detailed event content while a home screen (or a lock screen) is output, by commanding the display unit 151 to launch a defined application while a home screen (or lock screen) is output, by commanding the display unit 151 to launch a second application while a first application is running, or the like. As another example, while the controller 180 is operating in a reading mode for checking event content by displaying detailed information about an event, if the first-type gesture is input, the controller 180 can change the operating mode of the mobile terminal 100 to a call mode or a writing mode (for example, a text entry state, etc.). If the second-type gesture is entered, the mobile terminal 100 may control the execution of a command operation, which is appropriate for a running application, without changing the displayed content. The operation of the mobile terminal 100 in response to an input of a first-type or second-type gesture is described in detail with reference to the accompanying drawings as follows. Figure 4 is a flowchart of an operation of a mobile terminal 100 according to the present invention. Referring to Fig. 4, when a first-type gesture is entered [S401], if the mobile terminal 100 is in power saving mode [S402], the controller 180 cancels the power saving mode of the mobile terminal 100 and controls the output of the current time through the display unit 151 [S403]. In this case, the power saving mode keeps the display unit 151 in an OFF state to reduce the power consumption of the mobile terminal 100.
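As a compact illustration of steps S401 to S403, the wake-on-gesture behavior might look as follows; the display and controller classes are hypothetical stand-ins for the display unit 151 and the controller 180:

```kotlin
import java.time.LocalTime

// Hypothetical display state for the sketch.
class Display {
    var isOn = false
        private set
    fun turnOn(content: String) { isOn = true; println("display: $content") }
    fun turnOff() { isOn = false }
}

class PowerController(private val display: Display) {
    var powerSaving = true
        private set

    // S401: a first-type gesture arrives; S402/S403: leave power saving
    // and show the current time.
    fun onFirstTypeGesture() {
        if (powerSaving) {
            powerSaving = false
            display.turnOn("current time: ${LocalTime.now()}")
        }
    }

    fun enterPowerSaving() { powerSaving = true; display.turnOff() }
}

fun main() {
    val controller = PowerController(Display())
    controller.onFirstTypeGesture() // cancels power saving, shows the time
}
```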
If a defined condition is met, the mobile terminal 100 can enter the power saving mode. For example, if no input is applied to the mobile terminal 100 for a set time, or if a user input requesting entry into the power saving mode is received, the mobile terminal 100 can enter the power saving mode. Conversely, if an event occurs in the mobile terminal 100 or an input to cancel the power saving mode is received, the power saving mode can be canceled. Fig. 5 is a diagram of an example of canceling the power saving mode if a first-type gesture is entered. Referring to FIG. 5, while the mobile terminal 100 is operating in the power saving mode, as in the example shown in FIG. 5 (a), the display unit 151 keeps an OFF state. When the mobile terminal 100 operates in the power saving mode, if a first-type gesture is input, as in the example shown in Fig. 5 (b), the controller 180 cancels the power saving mode of the mobile terminal 100 and is able to control the display of current time information through the display unit 151. In the example shown in FIG. 5 (b), the first-type gesture includes an action of bringing a hand wearing the mobile terminal 100 toward the user's body. When the mobile terminal 100 is not operating in the power saving mode, if there is an event that has not yet been checked by a user in the mobile terminal 100 [S404], the controller 180 can control the display of detailed event information through the display unit 151 [S405]. In this case, the event may include one of a message reception (for example, a text message, an instant message, a push message, etc.), a presence of a missed incoming call, a reception of an e-mail, a case of meeting an alarm generation condition (for example, a case in which the current time is an alarm time, a case in which the current time is a start time of a pre-registered schedule, etc.), and the like. In addition, the detailed information (or content) of the event may include one of a detailed content of a message (e.g., the textual content of a text message, the textual content of an instant message, etc.), a phone number of a missed incoming call, a detailed content of an e-mail, an alarm content (for example, an alarm time, a detailed content of a pre-registered schedule, etc.), and the like. Fig. 6 is a diagram of an example of displaying detailed information about an event if a first-type gesture is entered. Referring to Figure 6, if an event occurs, as in the example shown in Figure 6 (a), the controller 180 is able to control the generation of feedback while displaying an indicator that an event has occurred. In the example shown in Fig. 6 (a), the information indicating the occurrence of the event includes a guide text indicating that a message has been received. The feedback can be implemented by vibration or sound output, to which the present invention is not limited. Then, if a first-type gesture input is received in the manner of a user moving a hand wearing the mobile terminal 100, as in the example shown in FIG. 6 (b), the controller 180 can control the display of detailed information about the event. In the example shown in Fig. 6 (b), the first-type gesture input includes a user action of bringing an arm toward the user's body. When the first-type gesture is entered, the detailed content of the received message is displayed. [0028] Once the mobile terminal 100 has entered the power saving mode after a predetermined time from the occurrence of the event, if the first-type gesture is entered, the controller 180 cancels the power saving mode
and is also able to control the generation of an indicator, which indicates that the event has occurred. [0029] For example, Figure 7 is a diagram of an example of generating an indicator indicating the occurrence of an event if a first-type gesture is entered. Referring to Figure 7, if an event occurs, as in the example shown in Figure 7 (a), as mentioned in the previous description with reference to Figure 6 (a), the controller 180 is able to generate information indicating that the event has occurred. If a defined time has elapsed since the occurrence of the event, as in the example shown in FIG. 7 (b), the controller 180 enters the power saving mode and accordingly turns off the display unit 151. [0030] Then, if a first-type gesture is entered, as in the example shown in Fig. 7 (c), the controller 180 can control the output of an indicator, which indicates that the event has occurred. If the indicator is touched or if the first-type gesture is entered again, the controller 180 can display detailed information about the event. While the event details are displayed, if the first-type gesture is re-entered [S406], the controller 180 can respond to the event that occurred [S407]. In particular, while a telephone number of a missed incoming call is displayed, if the first-type gesture is entered, the controller 180 may place a telephone call to the corresponding number or may control the display of a message composition screen, which is intended for composing a message to be sent to the corresponding telephone number. While the detailed content of a message is displayed, if the first-type gesture is entered, the controller 180 may control the display of a message composition screen, which is intended for composing a response to the counterpart having sent the message, or may place a phone call to the counterpart that sent the message. For example, Figure 8 is a diagram of an operation of a mobile terminal if a first-type gesture is entered while detailed information about an event is displayed. Referring to Figure 8, consider that a detailed content of a message is currently displayed [FIG. 8 (a)]. While the detailed content of the message is displayed, as in the example shown in Fig. 8 (b), if a gesture input of bringing an arm wearing the mobile terminal 100 toward the user's body is received, the controller 180 may control the display of a message composition screen, which is intended for composing a response to the counterpart having sent the message. In this case, the message composition screen may include a text body region 810 for displaying input content and a keyboard region 820 usable for entering text, to which the present invention is not limited. Meanwhile, while the detailed content of the message is displayed, if a first-type gesture input is received, the controller 180 can place a phone call to the counterpart having sent the message [not shown in the drawing]. It goes without saying that a gesture input for composing a response to the message may be different from a gesture input for placing a phone call. While the detailed content of an event is displayed, if a second-type gesture is entered, the controller 180 may scroll the display unit 151 to display content of the detailed event content that was not yet displayed on the display unit 151, or may control the display of the detailed content of a subsequent event. By way of illustration, Fig. 9A and Fig. 9B are diagrams representing an operation of a mobile terminal if a second-type gesture is entered.
For the sake of clarity of the following description, as in the examples shown in Fig. 9A (a) and Fig. 9A (b), consider that a detailed content of a message is currently displayed by means of the display unit 151. While the detailed content of the message is displayed, as in the example shown in FIG. 9A (b), if a gesture of twisting a wrist wearing the mobile terminal 100 is entered and the input speed of the gesture (for example, the moving speed of the mobile terminal 100, etc.) is less than a defined speed, the controller 180 can scroll through the detailed content of the message in response to the gesture input. If the wrist is twisted upward relative to the display unit 151, the controller 180 may scroll the detailed content of the message down (or up). If the wrist is twisted downward relative to the display unit 151, the controller 180 may scroll the detailed content of the message up (or down). While the detailed content of the message is displayed, as in the example shown in FIG. 9B (b), if a gesture of twisting a wrist wearing the mobile terminal 100 is entered and the input speed of the gesture is greater than or equal to the defined speed, the controller 180 may control the display of the detailed content of a next (or previous) message in response to the gesture input. If the wrist is twisted upward relative to the display unit 151, the controller 180 can control the display of the detailed content of the next (or previous) message. If the wrist is twisted downward relative to the display unit 151, the controller 180 may control the display of the detailed content of the previous (or next) message. While the detailed content of an event is displayed, if a second-type gesture is entered, the controller 180 may also enlarge or reduce the output of the display unit 151. By way of illustration, FIG. 10 is a diagram representing an operation of the mobile terminal 100 if a second-type gesture is entered. For the sake of clarity of the following description, as in the example shown in Fig. 10 (a), consider that a detailed content of a message is currently displayed through the display unit 151. While the detailed content of the message is displayed, as in the example shown in Fig. 10 (b), if a gesture input of adjusting a distance between two fingers (for example, a thumb and an index finger, etc.) of a hand wearing the mobile terminal 100 is received, the controller 180 can enlarge or reduce the output of the display unit 151 in response to the gesture input. If a gesture input of moving two fingers apart is received, the controller 180 can enlarge (or reduce) the output of the display unit 151. If a gesture input of bringing two fingers together is received, the controller 180 can reduce (or enlarge) the output of the display unit 151.
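The speed-dependent branching just described (a slow twist scrolls, a fast twist switches messages, a pinch zooms) can be sketched as a small dispatcher; the gesture names, the speed threshold and the viewer interface below are illustrative assumptions:

```kotlin
enum class Twist { UP, DOWN }

// Hypothetical second-type gestures recognized while event details are shown.
sealed class SecondTypeGesture {
    data class WristTwist(val direction: Twist, val speed: Float) : SecondTypeGesture()
    data class Pinch(val fingersApart: Boolean) : SecondTypeGesture()
}

const val SWITCH_SPEED = 50f // assumed threshold separating scroll from switch

// Hypothetical message viewer operations.
class MessageViewer {
    fun scroll(down: Boolean) = println(if (down) "scroll down" else "scroll up")
    fun show(next: Boolean) = println(if (next) "next message" else "previous message")
    fun zoom(enlarge: Boolean) = println(if (enlarge) "enlarge" else "reduce")
}

fun dispatch(gesture: SecondTypeGesture, viewer: MessageViewer) = when (gesture) {
    is SecondTypeGesture.WristTwist ->
        if (gesture.speed < SWITCH_SPEED) {
            viewer.scroll(down = gesture.direction == Twist.UP)  // slow twist: scroll
        } else {
            viewer.show(next = gesture.direction == Twist.UP)    // fast twist: switch
        }
    is SecondTypeGesture.Pinch -> viewer.zoom(enlarge = gesture.fingersApart)
}

fun main() {
    val viewer = MessageViewer()
    dispatch(SecondTypeGesture.WristTwist(Twist.UP, 20f), viewer)  // scroll down
    dispatch(SecondTypeGesture.WristTwist(Twist.UP, 80f), viewer)  // next message
    dispatch(SecondTypeGesture.Pinch(fingersApart = true), viewer) // enlarge
}
```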
While the detailed content of an event is displayed, if a second-type gesture is entered, the controller 180 can also control the output of a sound corresponding to a text-to-speech (TTS) conversion of the detailed content of the event. By way of illustration, FIG. 11 is a diagram of an example of an operation of a mobile terminal if a second-type gesture is entered. For the sake of clarity of the following description, as in the example shown in Fig. 11 (a), consider that a detailed content of a message is currently displayed by means of the display unit 151. While the detailed content of the message is displayed, as in the example shown in Fig. 11 (b), if a gesture input of stretching out five fingers is received, the controller 180 can control the output of a result of converting the detailed content (e.g., the text, etc.) of the message to sound in response to the gesture input. While the sound is output, if a gesture input of folding the fingers is received, the controller 180 can control stopping the sound. While the result of converting the detailed content of the message is output, as in the example shown in Fig. 11 (c), if a user input of twisting the wrist is received, the controller 180 can control the output of a result of converting the detailed content of a next (or previous) message to sound while displaying the detailed content of the next (or previous) message. If the wrist is twisted upward relative to the display unit 151, the detailed content of the next (or previous) message may be displayed. If the wrist is twisted downward relative to the display unit 151, the detailed content of the previous (or next) message may be displayed. In the examples shown in FIG. 9A and FIG. 9B, an action of twisting a wrist is a gesture input for scrolling or switching a message, the scrolling or the switching being determined as a function of the speed of the twist. It goes without saying, moreover, that a gesture different from that shown in FIG. 9A or FIG. 9B can be assigned to the scrolling or switching of a message. Moreover, although FIG. 10 shows that a gesture of adjusting a distance between the fingers is assigned to enlarging or reducing the output of the display unit 151, it goes without saying that a gesture different from that shown in Fig. 10 may be used to enlarge or reduce the output of the display unit 151. And the present invention is not limited to the gesture input shown as an example in Fig. 11. [0031] When the first-type gesture is entered [S401], if the mobile terminal 100 is not in the power saving state [S402] and there is no event that has not been checked by a user in the mobile terminal [S404], the controller 180 may execute a defined application or may control switching of a running application to a defined application [S408]. [0032] Fig. 12 is a diagram of an exemplary execution of a defined application. Referring to Fig. 12, while the current time is output [Fig. 12 (a)], if a first-type gesture is entered, the controller 180 can control the execution of a first application [Fig. 12 (b)]. [0033] While a defined application is running, if the first-type gesture is re-entered, as in the example shown in Fig. 12 (c), the controller 180 stops the execution of the first application and can command the execution of a second application. In this case, each of the first application and the second application can be determined to be a most recently executed application or an application frequently used by a user. Alternatively, each of the first application and the second application can be arbitrarily designated by a user. As in the example shown in FIG. 12, if a first-type gesture input is received, a user's accessibility to a defined application can be increased by executing the defined application.
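A sketch of this first-type-gesture application launching, where repeated first-type gestures cycle through a defined list of applications ordered, say, by recency of use; the list contents and the ordering policy are assumptions:

```kotlin
// Hypothetical registry of defined applications, ordered here by recency of use.
class AppLauncher(private val definedApps: List<String>) {
    private var index = -1

    // Each first-type gesture stops the current app (if any) and starts the next.
    fun onFirstTypeGesture(): String {
        index = (index + 1) % definedApps.size
        return definedApps[index].also { println("launching $it") }
    }
}

fun main() {
    // Assumed ordering: most recently executed first.
    val launcher = AppLauncher(listOf("music", "messages", "gallery"))
    launcher.onFirstTypeGesture() // launching music
    launcher.onFirstTypeGesture() // launching messages
}
```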
While the defined application is executed, if a second-type gesture is entered, the controller 180 may perform an execution command of the defined application in response to the second-type gesture. [0034] For example, while an application (for example, a music application, a video application, etc.) for playing a multimedia file (for example, a music file, a video file) is running, if a second-type gesture is input, the controller 180 may perform an operation such as a fast-forward (or fast-rewind) command, a next/previous file playback command, a play/pause command, and the like. Fig. 13A and Fig. 13B are diagrams of an example for describing an operation of a mobile terminal if a second-type gesture is input during the execution of a music application. Referring to FIG. 13A, while a music file is playing, as in the example shown in FIG. 13A (a), if a gesture of shaking a hand wearing the mobile terminal 100 away from the user's body is entered and the input speed of the gesture (i.e., the moving speed of the mobile terminal 100) is less than a set speed, the controller 180 may perform a fast-forward operation (or a fast-rewind operation) in response to the gesture input. Conversely, during the playback of a music file, as in the example shown in Fig. 13A (b), if a gesture of shaking a hand wearing the mobile terminal 100 toward the user's body is entered and the input speed of the gesture (i.e., the moving speed of the mobile terminal 100) is less than the set speed, the controller 180 can perform a fast-rewind operation (or a fast-forward operation) in response to the gesture input. While a music file is playing, as in the example shown in Fig. 13B (a), if a gesture of shaking a hand wearing the mobile terminal 100 away from the user's body is entered and the gesture input speed is greater than or equal to the set speed, the controller 180 can begin playing a next music file (or a previous music file) in response to the gesture input. Conversely, during the playback of a music file, as in the example shown in FIG. 13B (b), if a gesture of shaking a hand wearing the mobile terminal 100 toward the user's body is input and the gesture input speed is greater than or equal to the set speed, the controller 180 can begin playing a previous music file (or a next music file) in response to the gesture input. [0035] While an application (for example, a music application, a video application, etc.) for playing a multimedia file (for example, a music file, a video file) is executed, if a second-type gesture is entered, the controller 180 can also perform an operation of changing the playback order. [0036] FIG. 14 is a diagram for describing an operation of the mobile terminal 100 if a second-type gesture is entered during the execution of a music application. Referring to Figure 14, while several music files are set to be played in order, if a gesture of shaking a hand wearing the mobile terminal 100 is entered [Figure 14 (a)], the controller 180 may change the playback order of the music files in response to the gesture input [Figure 14 (b)].
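Under the same illustrative assumptions as before, the shake-direction and shake-speed branching of Figs. 13A and 13B can be sketched as follows; the player interface and the threshold value are hypothetical:

```kotlin
// Hypothetical shake gesture: direction relative to the user's body, plus speed.
data class Shake(val awayFromBody: Boolean, val speed: Float)

const val TRACK_SWITCH_SPEED = 60f // assumed threshold between seek and track change

// Hypothetical media player operations.
class Player {
    fun seek(forward: Boolean) = println(if (forward) "fast-forward" else "rewind")
    fun track(next: Boolean) = println(if (next) "next track" else "previous track")
}

fun onShake(shake: Shake, player: Player) {
    if (shake.speed < TRACK_SWITCH_SPEED) {
        player.seek(forward = shake.awayFromBody)  // slow shake: seek within the file
    } else {
        player.track(next = shake.awayFromBody)    // fast shake: change the file
    }
}

fun main() {
    val player = Player()
    onShake(Shake(awayFromBody = true, speed = 30f), player)  // fast-forward
    onShake(Shake(awayFromBody = false, speed = 90f), player) // previous track
}
```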
In response to a gesture input, the controller 180 may also start playing a multimedia file or stop (or pause) the playback of a multimedia file. [0037] Fig. 15 is a diagram of an exemplary operation of the mobile terminal 100 if a second-type gesture is entered. As in the example shown in Fig. 15 (a), if a gesture of stretching out all the fingers of a hand wearing the mobile terminal 100 is input, the controller 180 can control starting the playback of a multimedia file (for example, a music file, a video file, etc.) in response to the gesture input. As in the example shown in Fig. 15 (b), if a gesture of folding all the fingers of a hand wearing the mobile terminal 100 is entered while a multimedia file is playing, the controller 180 can control stopping (or pausing) the playback of the multimedia file in response to the gesture input. [0038] In the examples shown in Fig. 13A and Fig. 13B, an action of shaking a hand away from the user's body or an action of shaking a hand toward the user's body is intended to change a playback time (i.e., fast-forward or rewind) or to change the multimedia file during playback. In addition, depending on the moving speed of the hand, it is possible to determine whether it is the playback time of a multimedia file or the multimedia file itself that must be changed. However, the present invention is not limited by the gesture inputs shown in Fig. 13A and Fig. 13B. Moreover, a gesture different from that of the example shown in FIG. 13A or FIG. 13B can be assigned to a change of the playback time of a multimedia file or a change of multimedia file. For example, Fig. 16 is a diagram of another example of changing the playback time of a multimedia file. [0039] Referring to FIG. 16, while the mobile terminal 100 is worn on a wrist, if the wrist is moved so as to draw a circle clockwise while uniformly maintaining the orientation direction of the display unit (i.e., the display unit 151 is held facing upward), as in the example shown in Fig. 16 (a), the controller 180 can perform a fast-forward action (or a fast-rewind action). Conversely, while the mobile terminal 100 is worn on a wrist, if the wrist is moved so as to draw a circle counterclockwise while uniformly maintaining the orientation direction of the display unit (i.e., the display unit 151 is held facing upward), as in the example shown in Fig. 16 (b), the controller 180 can perform a fast-rewind action (or a fast-forward action). Moreover, although FIG. 14 shows that an action of shaking a hand is a gesture input for changing a playback order, another gesture different from that shown in FIG. 14 can be assigned as a gesture input for changing the playback order. Moreover, it goes without saying that a gesture different from that shown in FIG. 15 can be assigned as a gesture input for controlling the playback of a multimedia file. [0040] As mentioned in the preceding description, if a first-type gesture is entered, depending on the presence or absence of an event not yet checked by a user, the controller 180 can change the operating mode of the mobile terminal 100 (for example, enter a reading mode to display detailed information, switch from the reading mode to an input mode for entering text, switch from the reading mode to a call mode for placing a call, etc.) or can execute an application of the mobile terminal 100. If a second-type gesture is entered, the controller 180 can control the execution of an appropriate command operation while maintaining the currently displayed content (i.e., by keeping the detailed information about an event displayed, by keeping an application running, etc.). Unlike the example mentioned in the preceding description, when an event occurs, even if a second-type gesture is entered, the mobile terminal 100 according to the present invention can display detailed information about the event. When detailed information about the event is displayed in response to the second-type gesture, it is possible to prevent a user from entering a first-type gesture repeatedly.
For example, while the mobile terminal 100 is operating in the power saving mode, if a first-type gesture is input, as mentioned in the preceding description with reference to FIG. 5, the controller 180 can cancel the power saving mode of the mobile terminal 100. Thus, in the presence of an event not yet checked by a user in the mobile terminal 100, an indicator indicating that the event has occurred can be output together with current time information as soon as the power saving mode of the mobile terminal 100 is canceled. While the indicator is output, if a user tries to check the detailed event content, as mentioned in the previous description with reference to Figure 6 (a), it may be necessary to re-enter the first-type gesture. Thus, once the mobile terminal 100 has entered the power saving mode, it may be necessary for the user to enter the first-type gesture twice in succession in order to check the detailed content of the event. However, since the moving distance of the mobile terminal 100 is relatively long in the case of the first-type gesture, as in the previous examples shown in FIG. 3A and FIG. 3B, if the user is led to enter the first-type gesture consecutively, this may present an inconvenience for the user. Therefore, when an event occurs, even if a second-type gesture is entered, the mobile terminal 100 according to the present invention may be able to display detailed information about the event. For example, Fig. 17A and Fig. 17B are diagrams of an example of displaying detailed information about an event if a second-type gesture is entered. In the presence of an event that has not been checked, as in the example shown in Fig. 17A (a), the controller 180 can control the output of an indicator indicating an occurrence of an event. Then, as in the example shown in Fig. 17A (b), if a gesture of shaking a hand wearing the mobile terminal 100 once is entered, the controller 180 may control the display of detailed event information. The controller 180 may also control the output of a result of converting the detailed event information to audio or sound while displaying the detailed information of the event. For example, while an indicator indicating an occurrence of an event is output [FIG. 17B (a)], as in the example shown in Fig. 17B (b), if a gesture of twisting a hand wearing the mobile terminal 100 is entered twice, the controller 180 can control the output of a result of converting the detailed information about the event into sound or audio while displaying the detailed information of the event. Fig. 17A and Fig. 17B show examples of displaying detailed information about an event in response to a gesture of shaking or twisting a hand, to which a gesture input for displaying detailed information about the event is not limited. For the same reason, while detailed information about an event is displayed, if a second-type gesture is entered, the mobile terminal 100 according to the present invention can change the operating mode of the mobile terminal 100. When the operation of the mobile terminal 100 is switched to the call mode or the input mode by the second-type gesture, this can prevent a user from repeatedly entering a first-type gesture. For example, Fig. 18 is a diagram of an example of a change in the operating mode of the mobile terminal 100 if a second-type gesture is entered.
Referring to Figure 18 (a), while detailed information about an event is displayed, if a gesture input of bringing two fingers (for example, a thumb and an index finger) into contact once is received, the controller 180 can place a phone call to the counterpart that has sent a message. Referring to Fig. 18 (b), while detailed information about an event is displayed, if a gesture input of bringing two fingers into contact twice is received, the controller 180 may control the display of a message composition screen for composing a response. FIG. 18 shows an example in which the operating mode of the mobile terminal 100 is changed in response to a gesture of bringing two fingers into contact, to which a gesture input for changing the operating mode of the mobile terminal 100 is not limited. When an incoming call is received, the mobile terminal 100 according to the present invention can answer the incoming call or refuse it based on a gesture input of the user. For example, when an incoming call is received, if a first-type gesture is entered, the controller 180 may answer the incoming call. When an incoming call is received, if a second-type gesture is entered, the controller 180 may reject the incoming call. Figs. 19A-19C are exemplary diagrams of an operation of the mobile terminal 100 if a gesture input is received upon receipt of an incoming call. As in the example shown in Fig. 19A (a), when an incoming call is received, the controller 180 may control the output of information (e.g., information about the counterpart having placed the telephone call, etc.), which indicates that the incoming call is received, and of feedback (for example, vibration, sound, etc.). While the incoming call is received, if a gesture of lifting a hand wearing the mobile terminal 100 toward the user's body is entered, as in the example shown in Figure 19A (b), the controller 180 can answer the incoming call. On the other hand, upon receipt of an incoming call, if a gesture of rotating a hand wearing the mobile terminal 100 is entered, as in the example shown in Fig. 19B (b), the controller 180 may refuse the incoming call. When the incoming call is refused, while the presence of a missed call is notified, if a first-type gesture (for example, a gesture of lifting the hand wearing the mobile terminal 100 toward the user's body, etc.) is input, as in the example shown in Fig. 19B (c), the controller 180 can control the display of detailed information about the refused phone call. In this case, the detailed information about the refused telephone call may include at least one of a name of the counterpart having placed the telephone call, a telephone number of the counterpart having placed the telephone call, and information about the time at which the incoming call was received. When the mobile terminal 100 refuses the incoming call, the controller 180 can automatically send a message indicating that the user is unable to answer the phone call to the counterpart who placed the phone call. By way of illustration, as in the example shown in Fig. 19C (a), when the incoming call is received, if a gesture of rotating a hand wearing the mobile terminal 100 is entered twice, as in the example shown in Fig. 19C (b), the controller 180 can automatically send a message to the counterpart whose incoming call has been refused while refusing the phone call.
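A sketch of this incoming-call handling (a first-type gesture answers; one rotation rejects; two rotations reject and auto-send an apology message); the call-session interface and the canned message text are assumptions:

```kotlin
// Hypothetical gestures recognized while an incoming call is ringing.
enum class CallGesture { LIFT_TOWARD_BODY, ROTATE_ONCE, ROTATE_TWICE }

// Hypothetical call session with an assumed auto-reply text.
class CallSession(private val caller: String) {
    fun answer() = println("answering call from $caller")
    fun reject() = println("rejecting call from $caller")
    fun sendAutoReply() =
        println("to $caller: \"I cannot answer the phone right now.\"")
}

fun onIncomingCall(gesture: CallGesture, session: CallSession) = when (gesture) {
    CallGesture.LIFT_TOWARD_BODY -> session.answer() // first-type gesture
    CallGesture.ROTATE_ONCE -> session.reject()      // second-type gesture, once
    CallGesture.ROTATE_TWICE -> {                    // second-type gesture, twice
        session.reject()
        session.sendAutoReply()
    }
}

fun main() {
    onIncomingCall(CallGesture.ROTATE_TWICE, CallSession("counterpart"))
}
```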
Then, if a first-type gesture (for example, a gesture of lifting the hand wearing the mobile terminal 100 toward the body) is entered, as in the example shown in Fig. 19C (c), the controller 180 can control displaying the detailed content of the message that has been sent to the counterpart. For the sake of clarity of the descriptions made with reference to Figs. 19A-19C, an incoming call is answered or refused in response to a user gesture of lifting a hand toward the user's body or a gesture of rotating a hand. However, it goes without saying that a gesture input different from those shown in the examples of FIGS. 19A-19C may be intended to answer or reject an incoming call. If an alarm generation condition is met, the controller 180 may output an alarm such as a vibration, a sound, and the like. For example, if the current time is a predefined alarm time or a time at which a pre-registered schedule starts, the controller 180 may control the output of at least one of a vibration, a sound and the like. As mentioned in the preceding description with reference to FIG. 4, when an alarm generation condition is met, if a first-type gesture is input, the controller 180 can control the display of a detailed content of the alarm occurrence. For example, when the current time is an alarm time, while the alarm is generated, if a first-type gesture is entered, the controller 180 can control the output of detailed information, which includes at least one of a name of the defined alarm and the defined alarm time. In another case, when the current time is a start time of a pre-registered schedule, while the alarm is generated, if a first-type gesture is entered, the controller 180 can control the output of detailed information, which includes at least one of a name of the pre-registered schedule, a start time of the schedule and a start location of the schedule. When an alarm generation condition is met, if a second-type gesture is input during the generation of the alarm, the controller 180 may terminate the output of the alarm or postpone the alarm so that it is output again after a set time (i.e., snooze the alarm). For example, Fig. 20A and Fig. 20B are diagrams of an operation of a mobile terminal if a second-type gesture is input while an alarm is output. [0041] Referring to FIG. 20A, when an alarm generation condition is satisfied, while an alarm is being generated [FIG. 20A (a)], if a gesture of shaking the mobile terminal 100 is input, the controller 180 can control turning off the alarm [Figure 20A (b)]. On the other hand, referring to Figure 20B, when an alarm generation condition is satisfied, while an alarm is being generated [Figure 20B (a)], if a gesture of rotating the mobile terminal 100 is entered, the controller 180 can delay the alarm so that it is output again after a set time [FIG. 20B (b)]. FIG. 20A and FIG. 20B show examples of controlling an alarm output by means of a gesture of shaking an arm or a gesture of rotating an arm, to which a gesture for controlling an alarm output is not limited. Different types of gestures can be used to control the alarm output.
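A sketch of this alarm control, assuming a simple controller abstraction and a snooze interval (the description only says "a set time"):

```kotlin
// Hypothetical gestures recognized while an alarm is sounding.
enum class AlarmGesture { SHAKE, ROTATE }

class AlarmController(private val snoozeMinutes: Int = 5) { // assumed interval
    var sounding = true
        private set

    fun onGesture(gesture: AlarmGesture) = when (gesture) {
        AlarmGesture.SHAKE -> {
            sounding = false
            println("alarm stopped")
        }
        AlarmGesture.ROTATE -> {
            sounding = false
            println("alarm snoozed for $snoozeMinutes minutes")
        }
    }
}

fun main() {
    AlarmController().onGesture(AlarmGesture.ROTATE) // alarm snoozed for 5 minutes
}
```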
On the basis of a user gesture input, the mobile terminal 100 according to the present invention can also change the settings of the mobile terminal 100. In particular, the controller 180 is able to set a vibration mode or an audio output mode in response to a gesture input. The controller 180 can likewise adjust a volume or the brightness of the display unit 151 in response to a gesture input. For example, Fig. 21 is a diagram of an example of setting a vibration mode or an audio output mode by a gesture input. As in the examples shown in Fig. 21 (a) and Fig. 21 (b), while the display unit 151 of the mobile terminal 100 is covered, if a gesture of lifting the mobile terminal 100 upward in front of the body is input, the controller 180 can control switching between an audio output mode and a vibration mode. [0042] In the example shown in Fig. 21 (a) and Fig. 21 (b), the audio output mode of the mobile terminal 100 is switched to a vibration mode. Fig. 22 is a diagram of an example of setting the brightness of a display unit. [0043] Referring to FIG. 22, while the mobile terminal 100 is worn on a wrist, if the wrist is moved so as to draw a circle clockwise while uniformly maintaining the orientation direction of the display unit (i.e., the display unit 151 is held facing upward), as in the example shown in FIG. 22 (a), the controller 180 can control increasing (or reducing) the brightness of the display unit 151. On the other hand, while the mobile terminal 100 is worn on a wrist, if the wrist is moved so as to draw a circle counterclockwise while uniformly maintaining the orientation direction of the display unit (i.e., the display unit 151 is held facing upward), as in the example shown in Fig. 22 (b), the controller 180 can control reducing (or increasing) the brightness of the display unit 151. Thus, as in the examples shown in Fig. 21 and Fig. 22, the controller 180 is adapted to adjust the settings of the mobile terminal in response to a gesture input. However, the gesture inputs shown in Fig. 21 and Fig. 22 are proposed merely for the sake of clarity of the descriptions. In particular, it goes without saying that other gesture inputs different from those shown in the drawings can be applied to adjust the settings of the mobile terminal 100. The mobile terminal 100 according to the present invention can also unlock the display unit 151 by means of a gesture input of a user. In particular, the controller 180 may unlock the display unit 151 by considering at least one of a movement of a finger, a count of finger movements, and which finger is moved. For example, Fig. 23 is a diagram of an example of unlocking the display unit 151. For the sake of clarity of the following description, consider that an action of moving an index finger once and then moving a ring finger twice is defined as the password. Referring to Fig. 23, if the display unit 151 is locked, the controller 180 can keep the display unit 151 in a locked state until a password for unlocking the display unit 151 is entered. In this case, the locked state means a state in which authentication of the authority to use the mobile terminal is required before accessing content stored in the mobile terminal 100 or an application installed in the mobile terminal 100. [0044] If the state of the display unit 151 is set to the locked state, as in the example shown in Fig. 23 (a), the controller 180 can control the display of a lock screen containing current time information. While the state of the display unit 151 is set to the locked state, as in the example shown in Fig. 23 (b), if a gesture input of moving fingers is received, the controller 180 can determine whether the gesture input corresponds to the predefined password. For example, if a gesture of moving an index finger once and then moving a ring finger twice is entered, the controller 180 determines that the gesture corresponding to the password is entered and is therefore able to unlock the display unit 151.
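The matching just described amounts to comparing a detected sequence of (finger, repetition) events against a stored pattern; a minimal sketch, with the pattern representation assumed:

```kotlin
enum class Finger { THUMB, INDEX, MIDDLE, RING, LITTLE }

// One password element: a finger and how many times it is moved in a row.
data class FingerMove(val finger: Finger, val count: Int)

// Assumed stored pattern: index finger once, then ring finger twice.
val storedPassword = listOf(
    FingerMove(Finger.INDEX, 1),
    FingerMove(Finger.RING, 2)
)

// The detection unit is assumed to deliver the same (finger, count) summary.
fun matchesPassword(detected: List<FingerMove>): Boolean =
    detected == storedPassword

fun main() {
    val attempt = listOf(FingerMove(Finger.INDEX, 1), FingerMove(Finger.RING, 2))
    println(if (matchesPassword(attempt)) "unlocked" else "still locked") // unlocked
}
```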
Once the display unit 151 is unlocked, the controller 180 controls the display of the information that was displayed before the display unit 151 entered the locked state or, as in the example shown in Fig. 23 (c), can control the display of a home screen. While the mobile terminal 100 is operating in the power saving mode or the state of the display unit 151 is set to the locked state, if an event occurs, the controller 180 may control the display of a lock screen, which includes current time information and an indicator indicating an occurrence of the event. Thus, when a gesture input of moving one or more fingers is received, if the received gesture input corresponds to the predefined password, the controller 180 unlocks the display unit 151 and is also able to control the display of detailed information about the event. The controller 180 can control the output of a result of converting the detailed information of the event into audio or sound as soon as it outputs the detailed information of the event. [0045] Furthermore, the mobile terminal 100 according to the present invention can set the mode of the mobile terminal 100 to a power saving mode, or can control the entry of the mobile terminal 100 into a normal mode from the power saving mode, by means of a gesture input of the user. This is described in detail with reference to Fig. 24 as follows. Fig. 24 is a diagram of an example of switching the mode of a mobile terminal to a power saving mode. [0046] Referring to Figure 24, consider that an action of moving any one of the fingers twice consecutively is a gesture input for switching to the power saving mode. While the display unit 151 is on, if a gesture input of moving any finger twice is received [FIG. 24 (a)], the controller 180 controls the entry of the mobile terminal 100 into the power saving mode and is able to control turning off the display unit 151 [Figure 24 (b)]. On the other hand, while the display unit 151 is off, as in the example shown in Fig. 24 (b), if a gesture input of moving any finger twice is received, as in the example shown in FIG. 24 (a), the controller 180 cancels the power saving mode and is able to control turning on the display unit 151. In the event that an event occurs, the mobile terminal 100 according to the present invention can produce feedback in the form of at least one of a vibration and a sound. When the vibration is produced upon the occurrence of the event, if the user is moving vigorously (for example, when the user is exercising), it is difficult for the user to detect the vibration. Therefore, if the controller 180 determines that the user is moving vigorously, the controller 180 can control automatically increasing the strength of the produced vibration, as sketched below. In particular, the controller 180 can determine whether the user is moving vigorously based on a detection signal of the detection unit 140 such as a movement of the mobile terminal 100, a moving speed of the mobile terminal 100 and the like.
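A sketch of this adaptation, assuming a scalar "vigor" estimate derived from the detection unit and an assumed amplitude boost factor:

```kotlin
// Hypothetical vigor estimate (e.g., derived from movement speed); 0.0 = still.
const val VIGOROUS_THRESHOLD = 0.7f

// Returns a vibration amplitude in [0, 1]; boosted when the user moves a lot.
fun vibrationAmplitude(baseAmplitude: Float, vigor: Float): Float =
    if (vigor >= VIGOROUS_THRESHOLD) {
        (baseAmplitude * 1.5f).coerceAtMost(1f) // assumed boost factor
    } else {
        baseAmplitude
    }

fun main() {
    println(vibrationAmplitude(0.5f, vigor = 0.2f)) // 0.5  (normal)
    println(vibrationAmplitude(0.5f, vigor = 0.9f)) // 0.75 (boosted while exercising)
}
```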
Furthermore, if a user is moving vigorously, the mobile terminal 100 according to the present invention can control the output of feedback only for an event predefined by the user. For example, as in the example shown in Fig. 25 (a), consider that a user has predefined that feedback should only be generated in the case of an incoming call while exercising. Then, as in the example shown in Fig. 25 (b), while the user is moving vigorously (for example, practicing a sport, etc.), in the case of an incoming call, which is the predefined event, the controller 180 produces feedback. If an event (for example, receiving a message, etc.) other than the predefined event occurs, the controller 180 may skip producing the feedback. In the examples shown in Fig. 4 and Fig. 12, if a first-type gesture is entered, it is possible to execute a defined application in response to the first-type gesture input. In addition, as mentioned in the preceding description with reference to Fig. 12, the defined application may be a most recently executed application or a most frequently used application. On the other hand, once a second-type gesture has been entered, if a first-type gesture is entered, the controller 180 can determine the application, which is to be executed by the first-type gesture, based on a type of the second-type gesture, an input count of the second-type gesture, an input time of the second-type gesture, and the like. For example, Fig. 26A and Fig. 26B are diagrams illustrating an example of setting an application to execute in response to an input count of a second-type gesture. As in the example shown in Fig. 26A, once a gesture of rotating a hand wearing the mobile terminal 100 once has been entered, if a gesture of lifting the hand toward the body is entered, the controller 180 may control the execution of a first application (e.g., a music application in Fig. 26A). As in the example shown in Fig. 26B, once a gesture of rotating a hand wearing the mobile terminal 100 twice has been entered, if a gesture of lifting the hand toward the body is entered, the controller 180 may control the execution of a second application (e.g., a message application in Fig. 26B). As in the examples shown in Fig. 26A and Fig. 26B, the controller 180 can determine the application, which is to be executed, based on the input of a second-type gesture made before the first-type gesture is entered. Accordingly, the embodiments of the present invention provide various effects and/or features. [0047] First of all, the present invention proposes a mobile terminal and its control method, allowing an improvement of the comfort of the user. The present invention aims in particular to provide a mobile terminal and its control method, whereby the mobile terminal can be controlled remotely based on a gesture input of a user. In addition, remote control of a mobile terminal by a user is facilitated in such a way that a gesture input for performing a content change or a mode change and a gesture input for performing a general control operation without a content change or a mode change are distinguished from each other according to a type of the gesture input. It will be understood by those skilled in the art that the present invention may be embodied in other forms without departing from the spirit and scope of the invention. Moreover, the methods described above can be implemented on a program-recorded medium in the form of processor-readable code. The processor-readable medium may include all kinds of recording devices in which processor-readable data is stored. Examples of such processor-readable media include ROM, RAM, CD-ROM, magnetic tapes, floppy disks, optical data storage elements and the like, and also include carrier-wave type implementations (for example, transmission over the Internet).
It will be understood by those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention. Thus, it is understood that the present invention covers such modifications and variations provided that they fall within the scope of the appended claims and their equivalents.
Claims (14) [0001] 1. A watch-type mobile terminal (200) comprising: a sensor (140); a display (151); and a controller (180) configured to: identify a first gesture input based on an input received from the sensor, wherein the first gesture input is identified when the mobile terminal is moved or rotated; and cause the display to display detailed information about an event at the mobile terminal in response to the identification of the first gesture input. [0002] The watch-type mobile terminal (200) according to claim 1, wherein the controller (180) is further configured to: cause the display (151) to display a response screen when another first gesture input is identified while the detailed information about the event is displayed. [0003] The watch-type mobile terminal (200) according to claim 2, wherein the response screen comprises a message composition screen when the event is an incoming message. [0004] The watch-type mobile terminal (200) according to claim 1, wherein the controller (180) is further configured to cause the display to display a response screen comprising a telephone call placing screen in response to the identification of the first gesture input when the event is an unanswered incoming telephone call. [0005] The watch-type mobile terminal (200) according to any one of claims 1 to 4, further comprising an audio output (152), wherein the controller (180) is further configured to: identify a second gesture input based on the input received from the sensor while the detailed information about the event is displayed; and cause the display (151) to display a scrolled portion of the detailed information, cause the display (151) to display the detailed information in a larger or smaller size, or cause the audio output (152) to output the detailed information converted to voice audio. [0006] The watch-type mobile terminal (200) according to any one of claims 1 to 5, wherein, if the event does not exist at the mobile terminal, the controller (180) is further configured to: cause a multimedia playback application to be executed in response to the identification of the first gesture input; and identify a second gesture input based on the input received from the sensor, and stop playing a multimedia file being played by the multimedia playback application, change a playlist order of the multimedia playback application, or change a current playback time of the multimedia file being played. [0007] The watch-type mobile terminal (200) according to any one of claims 1 to 6, wherein, if an incoming telephone call is received at the mobile terminal, the controller (180) is further configured to: accept the incoming telephone call when the first gesture input is identified; and reject the incoming telephone call when a second gesture input is identified based on the input received from the sensor (140). [0008] The watch-type mobile terminal (200) according to claim 7, further comprising a communication unit (110), wherein the controller (180) is further configured to cause the communication unit (110) to send a message to a mobile terminal corresponding to the incoming telephone call when the incoming telephone call is refused. [0009] The watch-type mobile terminal (200) according to claim 8, wherein the controller (180) is further configured to cause the display to display the sent message in response to the identification of another first gesture input once the incoming telephone call has been refused.
[0010] The watch-type mobile terminal (200) according to any one of claims 1 to 5, wherein: if the event is an alarm, the detailed information includes alarm information or schedule information associated with the alarm; and the controller (180) is further configured to turn off the alarm or snooze the alarm when a second gesture input is identified based on the input received from the sensor (140). [0011] The watch-type mobile terminal (200) according to any one of claims 1 to 10, wherein the controller (180) is configured to cause the display (151) to display second information relating to first information when a second gesture input is identified based on the input received from the sensor while the first information is displayed. [0012] The watch-type mobile terminal (200) according to claim 11, wherein: the first gesture input and the second gesture input each comprise a movement of the mobile terminal, wherein the controller is further configured to distinguish between the first gesture input and the second gesture input based on a threshold distance of movement; or the first gesture input and the second gesture input each comprise a rotation of the mobile terminal, wherein the controller is further configured to distinguish between the first gesture input and the second gesture input based on a threshold degree of rotation. [0013] The watch-type mobile terminal (200) according to claim 1, wherein the controller (180) is further configured to: turn off the display while the mobile terminal is in a power saving mode; and terminate the power saving mode when the first gesture input is identified. [0014] A method of controlling a watch-type mobile terminal (200), the method comprising: identifying (S401) a first gesture input based on an input received from a sensor, wherein the first gesture input is identified when the mobile terminal is moved or rotated; and displaying (S405) detailed information relating to an event in response to the identification of the first gesture input.
Similar technologies:
Publication number | Publication date | Patent title
FR3022367A1 | 2015-12-18 |
FR3021133B1 | 2019-08-30 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3021424B1 | 2019-09-20 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3031601B1 | 2019-08-30 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021134A1 | 2015-11-20 | MOBILE TERMINAL
EP3122024B1 | 2018-09-19 | Mobile terminal and controlling method thereof
FR3021766A1 | 2015-12-04 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3026201A1 | 2016-03-25 |
FR3043478A1 | 2017-05-12 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3025328B1 | 2019-07-12 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021425A1 | 2015-11-27 |
FR3024786A1 | 2016-02-12 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
FR3021767A1 | 2015-12-04 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
FR3021136A1 | 2015-11-20 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3046470B1 | 2019-11-08 | MOBILE TERMINAL
FR3021485A1 | 2015-11-27 | MOBILE DEVICE AND METHOD OF CONTROLLING THE SAME
US10015400B2 | 2018-07-03 | Mobile terminal for capturing an image and associated image capturing method
FR3039673A1 | 2017-02-03 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3019665A1 | 2015-10-09 |
FR3021132A1 | 2015-11-20 |
FR3040221A1 | 2017-02-24 |
US20160054567A1 | 2016-02-25 | Mobile terminal, glasses-type terminal, and mutual interworking method using screens thereof
FR3042084B1 | 2019-11-08 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3039674A1 | 2017-02-03 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
US10261671B2 | 2019-04-16 | Mobile terminal and method for controlling the same
Family patents:
Publication number | Publication date
EP2957986B1 | 2019-06-12
US10551929B2 | 2020-02-04
KR20150144668A | 2015-12-28
CN105282316A | 2016-01-27
CN105282316B | 2019-02-15
EP2957986A1 | 2015-12-23
US20150362999A1 | 2015-12-17
FR3022367B1 | 2018-09-07
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
EP2194698A1 | 2008-12-04 | 2010-06-09 | Samsung Electronics Co., Ltd. | Watch phone and method for handling an incoming call in the watch phone
US20130058197A1 | 2011-09-07 | 2013-03-07 | Casio Computer Co., Ltd. | Electronic timepiece
US20140139454A1 | 2012-11-20 | 2014-05-22 | Samsung Electronics Company, Ltd. | User Gesture Input to Wearable Electronic Device Involving Movement of Device
US20140160078A1 | 2012-12-10 | 2014-06-12 | Samsung Electronics Co., Ltd. | Mobile device of bangle type, control method thereof, and UI display method
US6477117B1 | 2000-06-30 | 2002-11-05 | International Business Machines Corporation | Alarm interface for a smart watch
US20060028429A1 | 2004-08-09 | 2006-02-09 | International Business Machines Corporation | Controlling devices' behaviors via changes in their relative locations and positions
US8768648B2 | 2010-09-30 | 2014-07-01 | Fitbit, Inc. | Selection of display power mode based on sensor data
US8532654B2 | 2011-11-09 | 2013-09-10 | AT&T Intellectual Property I, L.P. | Detecting subscriber input via a wireless communication device for initiating network provisioning of such device
US9141194B1 | 2012-01-04 | 2015-09-22 | Google Inc. | Magnetometer-based gesture sensing with a wearable device
US20180032997A1 | 2012-10-09 | 2018-02-01 | George A. Gordon | System, method, and computer program product for determining whether to prompt an action by a platform in connection with a mobile device
US9442570B2 | 2013-03-13 | 2016-09-13 | Google Technology Holdings LLC | Method and system for gesture recognition
US20140267024A1 | 2013-03-15 | 2014-09-18 | Eric Jeffrey Keller | Computing interface system
GB2526217B | 2013-03-15 | 2020-11-04 | Intel Corp | Mobile computing device technology and systems and methods utilizing the same
CN103217895B | 2013-04-07 | 2016-01-20 | 上海与德通讯技术有限公司 | Support electronic wristwatch and the electronics of gesture function
DE102013007250A1 | 2013-04-26 | 2014-10-30 | Inodyn Newmedia Gmbh | Procedure for gesture control
CN103442129A | 2013-08-09 | 2013-12-11 | 宇龙计算机通信科技有限公司 | Interactive method and system between intelligent watch and mobile terminal
US20150065104A1 | 2013-08-30 | 2015-03-05 | Blackberry Limited | Method and system for status notifications for conferences
US9407756B2 | 2013-09-28 | 2016-08-02 | Intel Corporation | Notification acknowledgement in electronic devices
US9400634B2 | 2013-10-28 | 2016-07-26 | Google Technology Holdings LLC | Systems and methods for communicating notifications and textual data associated with applications
KR20150049977A | 2013-10-31 | 2015-05-08 | LG Electronics Inc. | Digital device and method for controlling the same
CN103777752A | 2013-11-02 | 2014-05-07 | 上海威璞电子科技有限公司 | Gesture recognition device based on arm muscle current detection and motion sensor
US9367086B2 | 2013-12-10 | 2016-06-14 | Atmel Corporation | Smart watch with adaptive touch screen
US9971412B2 | 2013-12-20 | 2018-05-15 | Lenovo Pte. Ltd. | Enabling device features according to gesture input
US20150188861A1 | 2013-12-26 | 2015-07-02 | Aaren Esplin | Mechanism for facilitating dynamic generation and transmission of canned responses on computing devices
US9921658B2 | 2014-02-06 | 2018-03-20 | Sony Mobile Communications, Inc. | Device and method for detecting gestures on the skin
US10255566B2 | 2011-06-03 | 2019-04-09 | Apple Inc. | Generating and processing task items that represent tasks to perform
CN113470640A | 2013-02-07 | 2021-10-01 | Apple Inc. | Voice trigger of digital assistant
US10170123B2 | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation
US9715875B2 | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases
WO2015184186A1 | 2014-05-30 | 2015-12-03 | Apple Inc. | Multi-command single utterance input method
US20160188290A1 | 2014-12-30 | 2016-06-30 | Anhui Huami Information Technology Co., Ltd. | Method, device and system for pushing audio
US9886953B2 | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation
US10200824B2 | 2015-05-27 | 2019-02-05 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
CN105095120A | 2015-07-03 | 2015-11-25 | Huizhou TCL Mobile Communication Co., Ltd. | Terminal control method and accessory device
US10747498B2 | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant
KR20170054861A | 2015-11-10 | 2017-05-18 | Samsung Electronics Co., Ltd. | Wearable device and control method thereof
US10761714B2 | 2015-11-23 | 2020-09-01 | Google Llc | Recognizing gestures and updating display by coordinator
CN105867778B | 2016-03-28 | 2020-01-31 | Lenovo Ltd. | Information processing method and electronic equipment
KR101687174B1 | 2016-04-29 | 2016-12-16 | 주식회사 조이펀 | A message display method on virtual reality device according to event occurrence and the message display apparatus by using the same
US10638316B2 | 2016-05-25 | 2020-04-28 | Intel Corporation | Wearable computer apparatus with same hand user authentication
CN106066695A | 2016-06-06 | 2016-11-02 | Beijing Qihoo Technology Co., Ltd. | Application program progress control method, device and wearable device
US10586535B2 | 2016-06-10 | 2020-03-10 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment
DK201670540A1 | 2016-06-11 | 2018-01-08 | Apple Inc | Application integration with a digital assistant
US10928918B2 | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak
DK179822B1 | 2018-06-01 | 2019-07-12 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device
DK180639B1 | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT
CN110620840A | 2018-06-20 | 2019-12-27 | Alibaba Group Holding Ltd. | Call processing method, device, terminal equipment and storage medium
CN108965584A | 2018-06-21 | 2018-12-07 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Processing method of voice messaging, device, terminal and storage medium
US11099647B2 | 2018-08-05 | 2021-08-24 | Pison Technology, Inc. | User interface control of responsive devices
US10802598B2 | 2018-08-05 | 2020-10-13 | Pison Technology, Inc. | User interface control of responsive devices
US11169612B2 | 2018-11-27 | 2021-11-09 | International Business Machines Corporation | Wearable device control
KR20200112378A | 2019-03-22 | 2020-10-05 | Samsung Electronics Co., Ltd. | Electronic device having two display surfaces and method for operating a display thereof
CN110083208A | 2019-04-29 | 2019-08-02 | Nubia Technology Co., Ltd. | Overturning control method, equipment and computer readable storage medium
DK180129B1 | 2019-05-31 | 2020-06-02 | Apple Inc. | User activity shortcut suggestions
US11199908B2 | 2020-01-28 | 2021-12-14 | Pison Technology, Inc. | Wrist-worn device-based inputs for an operating system
US11157086B2 | 2020-01-28 | 2021-10-26 | Pison Technology, Inc. | Determining a geographical location based on human gestures
Legal status:
2016-05-30 | PLFP | Fee payment | Year of fee payment: 2
2016-07-08 | PLSC | Publication of the preliminary search report | Effective date: 20160708
2017-05-30 | PLFP | Fee payment | Year of fee payment: 3
2018-05-29 | PLFP | Fee payment | Year of fee payment: 4
2019-04-10 | PLFP | Fee payment | Year of fee payment: 5
2020-04-08 | PLFP | Fee payment | Year of fee payment: 6
2021-04-09 | PLFP | Fee payment | Year of fee payment: 7
Priority:
Application number | Filing date | Patent title
KR1020140073818A | 2014-06-17 | 2014-06-17 | Mobile terminal and method for controlling the same
KR1020140073818 | 2014-06-17 |