Patent abstract:
A mobile terminal (100) capable of implementing an always-on function at low power, as a function of motion, and an associated motion-based low-power processing method are described. The method includes: sensing (S100) a user's movement using an acceleration sensor in an always-on, low-power state in which only the acceleration sensor is activated; selectively activating (S110) an always-on function according to the detected movement; and automatically deactivating (S210) the always-on function and entering the always-on, low-power state when no movement is detected in an always-on state.

Publication number: FR3026202A1
Application number: FR1554555
Filing date: 2015-05-21
Publication date: 2016-03-25
Inventors: Byoungeul Kim; Hyunjin Kim; Chohee Kwon; Jongbeom Kim; Hyunwoo Kim
Applicant: LG Electronics Inc
IPC main class:
Patent description:
[0001] The present invention relates to a mobile terminal capable of reducing the electrical energy consumption of an always-on function according to a movement (displacement), and an associated motion-based low-power implementation method. [0002] Terminals can generally be classified as mobile/portable terminals or stationary terminals depending on their mobility. Mobile terminals can also be classified as handheld terminals or vehicle-mounted terminals depending on whether or not a user can carry the terminal directly. [0003] Mobile terminals have become increasingly functional. Examples of such functions include data and voice communications, image and video capture via a camera, audio recording, playback of music files via a speaker system, and display of images and video on a display screen. [0004] Some mobile terminals include additional functionality that supports game playing, while other terminals are configured as media players. More recently, mobile terminals have been configured to receive broadcast and multicast signals that allow viewing of content such as videos and television programs. [0005] Since mobile devices have recently been provided with an always-on function, various always-on features are implemented as basic functions of the mobile device. For this purpose, a sensor hub is introduced to implement such always-on characteristics. [0006] Once the always-on function is activated, an acceleration sensor, a microcontroller (MCU), and an additional sensor maintain an "on" state so that input by a user can still be detected even in an LCD-off state. However, since the always-on function is permanently powered, a quiescent current of a predetermined amount (about 1.65 mA) is continuously consumed by the acceleration sensor, the additional sensor, and the MCU.
Thus, the manner in which the always-on function, which consumes current at all times at low power, is implemented can be a very important factor in determining a standby time (passive availability time) of the mobile device. Moreover, in a portable device fitted with a small-capacity battery, implementing such an always-on function at low power can be even more important. Therefore, an object of the present invention is to provide a mobile terminal capable of minimizing electrical energy consumption by selectively powering an always-on function according to a movement, and an associated motion-based low-power implementation method. Another object of the present invention is to provide a mobile terminal capable of reducing power consumption by automatically turning a display on or off via recognition of a user's behavior, and an associated motion-based low-power implementation method. Yet another object of the present invention is to provide a mobile terminal capable of performing various types of functions at low power, depending on the movement of a user, and an associated motion-based low-power implementation method. To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, a motion-based low-power processing method of a mobile terminal is provided, the method including: detecting a movement of a user by using an acceleration sensor in an always-on, low-power state in which only the acceleration sensor is activated; selectively activating an always-on function according to the detected movement; and automatically deactivating the always-on function and entering the always-on, low-power state when no movement is detected in an always-on state.
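The activation/deactivation loop described above can be pictured as a two-state machine. The following is a minimal illustrative sketch, not the patent's actual implementation; the class and state names are hypothetical, and the step labels S110/S210 follow the abstract.

```python
from enum import Enum

class PowerState(Enum):
    LOW_POWER = "low_power"   # only the acceleration sensor is powered
    ALWAYS_ON = "always_on"   # always-on function fully active

class AlwaysOnController:
    """Hypothetical sketch: stay in a low-power state with only the
    accelerometer on, activate the always-on function when motion is
    sensed, and fall back to low power when motion stops."""

    def __init__(self):
        self.state = PowerState.LOW_POWER

    def on_sensor_sample(self, motion_detected: bool) -> PowerState:
        if motion_detected and self.state is PowerState.LOW_POWER:
            self.state = PowerState.ALWAYS_ON    # S110: selective activation
        elif not motion_detected and self.state is PowerState.ALWAYS_ON:
            self.state = PowerState.LOW_POWER    # S210: automatic deactivation
        return self.state
```

In a real sensor-hub design the `on_sensor_sample` callback would be driven by accelerometer interrupts rather than polling, so that the application processor can stay asleep between events.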
In one embodiment of the present invention, the step of selectively activating an always-on function includes: checking a type of the detected motion; if the detected motion type is a movement of bringing the mobile terminal to a user's face, detecting a face by temporarily powering a camera; authenticating the user whose face is detected; and, when user authentication is complete, automatically turning on a screen and releasing a locked screen state. In another embodiment of the present invention, the step of selectively activating an always-on function may include: checking a type of the detected motion; if the checked motion type is a movement of bringing the mobile terminal to a user's face, detecting a face by temporarily powering a camera; authenticating the user whose face is detected; and, when user authentication is complete, displaying a watch image or notification information indicating receipt of an email or a call, in a screen-off state. In yet another embodiment of the present invention, the step of selectively activating an always-on function may include: checking a type of motion based on a detection value sensed by the acceleration sensor; and, if the checked motion type is walking, displaying a safety preview on one side of a screen. In one embodiment of the present invention, in the step of automatically deactivating the always-on function, if a user's activity is not detected while the amount of the user's activity is being measured by a sensor, the sensor may be turned off so as not to measure the amount of user activity.
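The raise-to-face wakeup sequence in the embodiments above can be sketched as follows. This is an illustrative outline only: the motion-type string, the callbacks, and the returned states are hypothetical stand-ins for the motion classifier, the temporarily powered camera, and the face-authentication module.

```python
def smart_wakeup(motion_type, detect_face, authenticate):
    """Hypothetical sketch of the flow described above: only a
    raise-to-face motion powers the camera, and the screen unlocks
    only after face authentication succeeds."""
    if motion_type != "raise_to_face":
        return "stay_locked"            # no camera power-up at all
    face = detect_face()                # camera is powered only briefly
    if face is None:
        return "stay_locked"
    if authenticate(face):
        return "screen_on_unlocked"     # turn on screen, release lock
    return "stay_locked"
```

The power saving comes from gating the camera behind the accelerometer: the expensive face-detection step never runs unless the cheap motion check has already passed.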
In yet another embodiment of the present invention, the step of selectively activating an always-on feature may include: determining a sleep type based on the detected motion; and controlling a sleep quality not to be measured if the determined sleep type corresponds to a waking state, but to be measured if the determined sleep type corresponds to a sleeping state. [0007] The motion-based low-power processing method of a mobile terminal may further include controlling the activated always-on function according to a user's movement. In one embodiment of the present invention, the step of controlling the always-on function may include: checking a degree of significance of measurement data at a user's current position, according to a continuously detected motion state; and adaptively controlling a period of environmental data measurement according to the checked degree of significance of the measurement data. [0008] In another embodiment of the present invention, the step of controlling the always-on function may include: firstly, determining a screen-off time according to whether or not an approaching object is detected in a screen-on state; and secondly, determining a screen-off time according to whether or not motion is detected if no approaching object is detected. The screen-off time may be controlled to be longest when motion is detected, and shortest when an approaching object is detected. In yet another embodiment of the present invention, the step of automatically deactivating the always-on function may be a step of temporarily turning off the always-on function currently being executed if no motion occurs, and the function may include a Bluetooth function, a hand-shake prevention function, and a GPS function.
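The adaptive measurement period of paragraph [0007] can be illustrated with a simple mapping: the more significant the measurement data at the current position, the shorter the sampling period. The function below is a hypothetical sketch under assumed period bounds; the patent does not specify the mapping or any concrete values.

```python
def measurement_period_s(significance: float,
                         base_period_s: float = 60.0,
                         min_period_s: float = 5.0) -> float:
    """Hypothetical mapping from a 0..1 'degree of significance' of
    measurement data at the current position to an environmental-data
    sampling period: significant positions are sampled often,
    insignificant ones rarely (saving sensor power)."""
    significance = max(0.0, min(1.0, significance))  # clamp to [0, 1]
    return max(min_period_s, base_period_s * (1.0 - significance))
```

For example, a stationary user at an uninteresting position (significance near 0) would be sampled once a minute, while a user moving through a significant position would be sampled every few seconds.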
In yet another embodiment of the present invention, the step of automatically deactivating the always-on function may include: detecting a detached state of a glasses-type terminal; if the glasses-type terminal is detached, detecting a movement of the glasses-type terminal; and, if no movement of the glasses-type terminal occurs, automatically deactivating a glasses function. In yet another embodiment of the present invention, the step of automatically deactivating the always-on function may include: determining a wrist position of a user who wears a smart watch, according to a detection value sensed by the acceleration sensor; and, if the determined wrist position is not a position for viewing a screen, automatically turning off the screen. To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, a mobile terminal is also provided, including: an acceleration sensor configured to detect a movement of a user in an always-on, low-power state in which only the acceleration sensor is activated; and a controller configured to selectively activate an always-on function by turning it on or off according to the detected motion, wherein the controller activates the always-on function when a user's motion is detected, and automatically deactivates the always-on function and enters, from an always-on state, the always-on, low-power state if no motion is detected. The controller may perform face detection and face authentication by temporarily powering a camera if the detected motion is a movement of bringing the mobile terminal to a face. If face authentication is completed, the controller may automatically turn on a screen and may release a locked screen state.
The controller may perform face detection and face authentication by temporarily powering a camera if the detected motion is a movement of bringing the mobile terminal to a face. If face authentication is completed, the controller may display a watch image or notification information indicating the receipt of an email or a call, in a screen-off state. The controller may check a type of motion based on a detection value sensed by the acceleration sensor, and may display a safety preview on one side of a screen if the checked motion type is walking. The controller may turn off the sensor so as not to measure the amount of user activity when a user's activity is not detected while the amount of user activity is being measured by a sensor. [0009] The controller may determine a sleep type according to the detected motion, and may control a sleep quality not to be measured if the determined sleep type corresponds to a waking state, but to be measured if the determined sleep type corresponds to a sleeping state. [0010] The controller may check a degree of significance of measurement data at a user's current position, based on a continuously detected motion state, and may adaptively control a period of environmental data measurement according to the checked degree of significance of the measurement data. [0011] The controller may, firstly, determine a screen-off time according to whether or not an approaching object is detected in a screen-on state, and secondly, determine a screen-off time according to whether or not a movement is detected if no approaching object is detected. The screen-off time may be controlled to be longest when motion is detected, and shortest when an approaching object is detected. The controller may automatically deactivate a glasses function if no movement of a glasses-type terminal is detected in a detached state of the glasses-type terminal.
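The two-step screen-off time determination of paragraph [0011] reduces to choosing among three timeouts. The sketch below is illustrative only; the three timeout values are hypothetical, since the patent specifies only their ordering (shortest with an approaching object, longest with motion).

```python
def screen_off_timeout_s(object_approaching: bool,
                         motion_detected: bool) -> float:
    """Hypothetical sketch of the two-step determination: an approaching
    object (e.g. the phone moving toward a pocket or ear) wins and gives
    the shortest timeout; otherwise detected motion gives the longest,
    and no motion an intermediate one."""
    SHORT, MEDIUM, LONG = 5.0, 15.0, 30.0   # illustrative values only
    if object_approaching:                   # step 1: proximity check
        return SHORT
    return LONG if motion_detected else MEDIUM  # step 2: motion check
```

The ordering captures the intent: a user who is moving is likely still interacting, so the screen stays on longer, while an object closing in on the proximity sensor suggests the screen is about to be covered and can be dimmed almost immediately.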
The controller may determine a wrist position of a user wearing a smart watch, based on a detection value sensed by the acceleration sensor. If the determined wrist position is not a position for viewing a screen, the controller may automatically turn off the screen. Fig. 1 is a block diagram illustrating a mobile terminal according to the present invention; Figs. 2A and 2B are conceptual views illustrating an example of a mobile terminal according to the present invention, viewed from different directions; Fig. 3 is a flowchart illustrating a motion-based low-power method according to the present invention; Figs. 4A and 4B are views comparing an on state of a conventional always-on function with a selectively activated state of an always-on function according to the present invention; Fig. 5 is a flowchart illustrating a motion-based low-power smart wakeup operation according to the present invention; Fig. 6 is a view illustrating a detailed embodiment of Fig. 5; Fig. 7 is a flowchart illustrating an operation for controlling additional information according to a motion according to the present invention; Fig. 8 is a view illustrating a detailed embodiment of Fig. 7; Fig. 9 is a view illustrating an example of an activity monitor screen displayed on a portable device; Fig. 10 is a flowchart illustrating a motion-based low-power activity monitoring method; Fig. 11 is a graph illustrating a general sleep state; Fig. 12 is a flowchart illustrating a first embodiment of motion-based low-power sleep monitoring according to the present invention; Fig. 13 is a view illustrating a detailed embodiment of Fig. 12; Fig. 14 is a flowchart illustrating a second embodiment of motion-based low-power sleep monitoring according to the present invention; Fig. 15 is a view illustrating a detailed embodiment of Fig. 14; Fig. 16 is a view illustrating another embodiment for displaying sleep quality information; Fig.
17 is a view illustrating an embodiment for providing a sleep guide according to activity monitoring in accordance with the present invention; Fig. 18 is a view illustrating an example of an environmental sensor monitoring screen according to the present invention; Fig. 19 is a view illustrating an embodiment for monitoring motion-based low-power environmental data in accordance with the present invention; Fig. 20 is a flowchart illustrating an embodiment for controlling a screen-off time, as a function of motion, according to the present invention; Fig. 21 is a flowchart illustrating an embodiment for terminating a connected state, according to a motion, according to the present invention; Figs. 22 to 24 are views illustrating a method of controlling various functions, according to a motion, according to the present invention; Fig. 25 is a view illustrating an embodiment for providing a low-power pedestrian safety preview according to a motion according to the present invention; Fig. 26 is a view illustrating a detailed embodiment of Fig. 25; Fig. 27 is a view illustrating an embodiment for automatically turning off a low-power display, according to a motion, according to the present invention; and Fig. 28 is a view illustrating a change of an acceleration value according to a position of an arm. A description will now be made in detail according to illustrative embodiments described herein, with reference to the accompanying drawings. For the brief description with reference to the drawings, identical reference numbers may be assigned to the same or equivalent components, and the description of these components will not be repeated. In general, a suffix such as "module" and "unit" may be used to refer to elements or components. The use of such a suffix herein is merely intended to facilitate brevity of description, and the suffix itself is not intended to give any special meaning or function.
In the present description, what is well known to those skilled in the relevant art has generally been omitted for brevity. The accompanying drawings are used to facilitate the understanding of various technical features, and it is to be understood that the embodiments set forth herein are not limited by the accompanying drawings. As such, the present description should be construed as extending to any modifications, equivalents and substitutes, in addition to those particularly shown in the accompanying drawings. [0012] The mobile terminals disclosed herein may be implemented using a variety of different types of terminals. Examples of such terminals include cell phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, personal computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like. By way of nonlimiting example only, further description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as the types noted above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers, and the like. Reference is now made to Figure 1 and Figures 2A and 2B, where Figure 1 is a block diagram of a mobile terminal in accordance with the present invention, and Figures 2A and 2B are conceptual views of an example of the mobile terminal, seen from different directions. [0013] The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and an electric power supply unit 190.
It is understood that implementing all of the illustrated components is not a requirement, and that more or fewer components may alternatively be implemented. The mobile terminal 100 is shown having a wireless communication unit 110 configured with several commonly implemented components. For example, the wireless communication unit 110 typically includes one or more components that enable wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. [0014] The wireless communication unit 110 typically includes one or more modules that enable communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115. The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a programmable key, and the like) to allow a user to enter information. Data (e.g., audio, video, image, and the like) is obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands, and combinations thereof.
The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, camera 121), a microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (e.g., an electronic nose, a health sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 may be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 is typically configured to output various types of information, such as audio, video, touch output, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an interlayer structure or an integrated structure with a touch sensor in order to provide a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as serve as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100.
The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160. The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed in the mobile terminal 100 at the time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The controller 180 typically serves to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components mentioned above, or by activating application programs stored in the memory 170. By way of example, the controller 180 controls some or all of the components illustrated in Fig.
1 according to the execution of an application program that has been stored in the memory 170. [0015] The electric power supply unit 190 may be configured to receive external electric power or provide internal electric power in order to supply the appropriate electric power required for operating elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body. At least some of the above components may operate cooperatively to implement an operation or a control method of a glasses-type terminal according to various embodiments to be explained below. The operation or the control method of the glasses-type terminal may be implemented on the glasses-type terminal by driving at least one application program stored in the memory 170. Still referring to Fig. 1, the components illustrated in this figure will now be described in more detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels. [0016] The broadcast management entity may be implemented using a server or system that generates and transmits a broadcast signal and/or broadcast-associated information, or a server that receives a previously generated broadcast signal and/or broadcast-associated information, and sends such items to the mobile terminal.
The broadcast signal may be implemented using any of a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and combinations thereof, among others. The broadcast signal, in some cases, may further include a data broadcast signal combined with a TV or radio broadcast signal. The mobile communication module 112 can transmit and/or receive wireless signals to and/or from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (e.g., Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA 2000, EV-DO, Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support the communication of text and multimedia messages. The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet access include Wireless LAN (WLAN), Wi-Fi, Wi-Fi Direct, DLNA, WiBro, WiMAX, HSDPA, HSUPA, LTE, LTE-A, and the like.
The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the Internet module 113 may cooperate with, or serve as, the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency IDentification (RFID), IrDA, Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Wireless Universal Serial Bus (Wireless USB), and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless networks. One example of such wireless networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses, or a head-mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 may sense or recognize the wearable device, and permit communication between the wearable device and the mobile terminal 100.
In addition, when the sensed wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180, for example, may cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Hence, a user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user may answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device. [0017] The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. As an example, the location information module 115 includes a GPS module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal. For example, when the mobile terminal uses a GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode.
The processed image frames can be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image. The microphone 122 is generally implemented to permit audio input to the mobile terminal 100. The audio input can be processed in various manners according to a function being executed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving the external audio. The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As one example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like.
The controller 180 generally cooperates with the detection unit 140 to control operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal based on the sensing provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail. The proximity sensor 141 may include a sensor to sense the presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or the like, without a mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen. [0018] The proximity sensor 141, for example, may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can sense proximity of a pointer relative to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor. The term "proximity touch" will often be used herein to denote the scenario in which a pointer is positioned to be proximate to the touch screen without contacting the touch screen. The term "contact touch" will often be used herein to denote the scenario in which a pointer makes physical contact with the touch screen.
For the position corresponding to the proximity touch of the pointer relative to the touch screen, such position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 may sense a proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, moving status, and the like). [0019] In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns sensed by the proximity sensor 141, and causes output of visual information on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to execute different operations or process different data according to whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch. A touch sensor can sense a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or changes of capacitance occurring at a specific part of the display unit 151, into electric input signals. The touch sensor may also be configured to sense not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus pen, a pointer, or the like. When a touch input is sensed by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180.
Accordingly, the controller 180 may sense which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof. In some embodiments, the controller 180 may execute the same or different controls according to a type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or different control according to the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program, for example. The touch sensor and the proximity sensor may be implemented individually, or in combination, to sense various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, may calculate a position of a wave generation source based on information sensed by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for which the light reaches the optical sensor is much shorter than the time for which the ultrasonic wave reaches the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. For instance, the position of the wave generation source may be calculated using the time difference from the time that the ultrasonic wave reaches the sensor, based on the light as a reference signal. The camera 121 typically includes at least one of a camera sensor (CCD, CMOS etc.), a photo sensor (or image sensors), and a laser sensor.
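The time-difference calculation described above can be sketched in a few lines. This is a minimal illustration only, assuming microsecond timestamps and treating the light arrival as effectively instantaneous; the function name, units and speed-of-sound value are illustrative assumptions, not from the patent:

```python
SPEED_OF_SOUND_MM_PER_US = 0.343  # ~343 m/s at 20 degrees C, expressed in mm per microsecond

def distance_from_wave_source(t_light_us: float, t_ultrasound_us: float) -> float:
    """Estimate the distance (mm) to the wave generation source.

    The light reaches the optical (illumination) sensor almost instantly,
    so its arrival time serves as the reference signal; the ultrasonic
    wave's extra travel time is proportional to the distance.
    """
    delta_t = t_ultrasound_us - t_light_us  # extra travel time of the ultrasonic wave, in microseconds
    return SPEED_OF_SOUND_MM_PER_US * delta_t

# e.g. a source roughly 100 mm away delays the ultrasonic arrival by about 290 microseconds
```

With several ultrasonic sensors, the controller would combine such per-sensor distances (e.g. by trilateration) to obtain a two-dimensional position rather than a single range.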
Implementing the camera 121 with a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan movement of the physical object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and photo transistors at rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the physical object according to variation of light to thus obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glass scheme), an auto-stereoscopic scheme (glasses-free scheme), a projection scheme (holographic scheme), or the like. The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like.
The audio output module 152 may provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like. A haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern and the like of the vibration generated by the haptic module 153 can be controlled by user selection or setting by the controller. For example, the haptic module 153 may output different vibrations in a combining manner or a sequential manner. Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement vertically moving to contact skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscle sensation such as the user's fingers or arm, as well as transferring the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100. An optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like.
A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated when the mobile terminal senses that a user has checked the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected with the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive electrical energy to be transferred to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identifying device") may take the form of a smart card. Accordingly, the identifying device can be connected with the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected with an external docking station, the interface unit 160 can serve as a passage to allow electrical energy from the docking station to be supplied to the mobile terminal 100 or may serve as a passage to allow various command signals input by the user from the docking station to be transferred to the mobile terminal therethrough.
Various command signals or electrical energy input from the docking station may operate as signals for recognizing that the mobile terminal is properly mounted on the docking station. The memory 170 can store programs to support operations of the controller 180 and store input/output data (for example, phonebook, messages, still images, videos, etc.). The memory 170 may store data related to various patterns of vibrations and audio which are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to applications when a status of the mobile terminal meets a preset condition. The controller 180 can also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one or a combination of those components in order to implement various exemplary embodiments disclosed herein.
[0020] The power supply unit 190 receives external electrical power or provides internal electrical power and supplies the appropriate electrical power required for operating respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger, for supplying power to recharge the battery, is electrically connected. As another example, the power supply unit 190 may be configured to recharge the battery wirelessly without use of the connection port. In this example, the power supply unit 190 can receive electrical energy, transferred from an external wireless power transmitter, using at least one of an inductive coupling method which is based on magnetic induction, or a magnetic resonance coupling method which is based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or similar medium using, for example, software, hardware, or any combination thereof. Referring now to Figures 2A and 2B, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include a watch type, a bar type, a glasses type, or a folder type, flip type, slide type, swing type, and swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The description herein will often refer to a particular type of mobile terminal (e.g., a bar type, a watch type, a glasses type, and the like).
However, such teachings with respect to a particular type of mobile terminal will generally apply to other types of mobile terminals as well. [0021] The mobile terminal 100 will generally include a housing (for example, a frame, case, cover, and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated into a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear housing 102. Thus, when the rear cover 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, upon the coupling, the rear housing 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b. The housings 101, 102, 103 may be formed by injection-molding a synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like.
Instead of the example in which the plurality of housings form an interior space for housing components, the mobile terminal 100 may be configured such that a housing forms the interior space. In the present example, a mobile terminal 100 having a uni-body is formed such that a synthetic resin or metal extends from a side surface to a back surface. [0022] If desired, the mobile terminal 100 may include a water seal unit (not shown) to prevent the introduction of water into the terminal body. For example, the water seal unit may include a water seal element that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, for sealing an interior space when these housings are coupled. FIGS. 2A and 2B illustrate certain components as they are arranged on the mobile terminal. However, it is to be understood that other arrangements are possible and within the teachings of the present disclosure. Some components may be omitted or rearranged. For example, the first handling unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display run screen information of an executed application program in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD) screen, a thin film transistor (TFT-LCD) liquid crystal display, an organic light-emitting diode (OLED), a flexible screen, a three-dimensional (3D) screen, an electronic ink screen, and combinations thereof. 
The display unit 151 can be implemented using two display devices, which can implement the same or different display technology. For instance, a plurality of the display units 151 may be arranged on one side, either spaced apart from each other, or these devices may be integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor which senses a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to sense this touch and the controller 180, for example, may generate a control command or other signal corresponding to the touch. The content which is input in a touching manner may be a text or numerical value, or a menu item which can be indicated or designated in various modes. [0023] The touch sensor may be configured in a form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a metal wire which is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see Figure 1). Therefore, the touch screen may replace at least some of the functions of the first handling unit 123a. [0024] The first audio output module 152a may be implemented in the form of a speaker to output voice audio, alarm sounds, multimedia audio reproduction, and the like. The window 151a of the display unit 151 will typically include an aperture to permit audio generated by the first audio output module 152a to pass. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101).
In this case, a hole independently formed to output audio sounds may not be seen or is otherwise hidden in terms of appearance, thereby further simplifying the appearance and manufacturing of the mobile terminal 100. The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170. The first and second handling units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide input to the mobile terminal 100. The first and second handling units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like. The first and second handling units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as proximity touch, hovering, or the like. Figures 2A and 2B illustrate the first handling unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. [0025] Input received at the first and second handling units 123a and 123b may be used in various ways. For example, the first handling unit 123a may be used by the user to provide an input to a menu, home key, cancel, search, or the like, and the second handling unit 123b may be used by the user to provide an input to control a volume level being output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. [0026] The rear input unit can be manipulated by a user to provide input to the mobile terminal 100.
The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, control of the volume level being output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit touch input, push input, or combinations thereof. The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. As one example, the rear input unit may be located on an upper end portion of the rear side of the terminal body such that a user can easily manipulate it using a forefinger when the user grabs the terminal body with one hand. Alternatively, the rear input unit can be positioned at most any location of the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first handling unit 123a in the rear input unit. As such, in situations where the first handling unit 123a is omitted from the front side, the display unit 151 can have a larger screen. As a further alternative, the mobile terminal 100 may include a finger scan sensor which scans a user's fingerprint. The controller 180 can then use fingerprint information sensed by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The second camera 121b is shown located at the rear side of the terminal body and includes an image capturing direction that is substantially opposite to the image capturing direction of the first camera 121a. The second camera 121b can include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. The cameras may be referred to as an "array camera".
When the second camera 121b is implemented as an array camera, images may be captured in various manners using the plurality of lenses and images with better quality may be obtained. [0027] A flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. The second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used for implementing a speaker phone mode for call communication. [0028] At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna which configures a part of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a housing that includes a conductive material. A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 may receive power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged in a wireless manner using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance. [0029] The rear cover 103 is shown coupled to the rear housing 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from an external impact or from foreign material. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear housing 102.
[0030] An accessory for protecting an appearance or assisting or extending the functions of the mobile terminal 100 can also be provided on the mobile terminal 100. As one example of an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the function of the mobile terminal 100. Another example of the accessory is a touch pen for assisting or extending a touch input onto a touch screen. A communication system which is operable with the variously described mobile terminal 100 will now be described in more detail. Such a communication system may be configured to utilize any of a variety of different air interfaces and/or physical layers. Examples of such air interfaces utilized by the communication system include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (including Long-Term Evolution (LTE), LTE-Advanced (LTE-A)), Global System for Mobile Communications (GSM), and the like. [0031] By way of a non-limiting example only, further description will relate to a CDMA communication system, but such teachings apply equally to other system types, including a wireless CDMA communication system as well as an Orthogonal Frequency Division Multiplexing (OFDM) wireless communication system. [0032] A wireless CDMA communication system generally includes one or more mobile terminals (MT or User Equipment, UE) 100, one or more base stations (BSs, NodeB, or evolved NodeB), one or more base station controllers (BSCs), and a mobile switching center (MSC). The MSC is configured to interface with a conventional Public Switched Telephone Network (PSTN) and the BSCs. The BSCs are coupled to the base stations via terrestrial trunks.
The terrestrial trunks may be configured in accordance with any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Hence, a plurality of BSCs can be included in the wireless CDMA communication system. Each base station may include one or more sectors, each sector having an omni-directional antenna or an antenna pointed in a particular direction radially away from the base station. Alternatively, each sector may include two or more different antennas. Each base station may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.). The intersection of sector and frequency assignment may be referred to as a CDMA channel. The base stations may also be referred to as Base Station Transceiver Subsystems (BTSs). In some cases, the term "base station" may be used to refer collectively to a BSC, and one or more base stations. The base stations may also be denoted as "cell sites". Alternatively, individual sectors of a given base station may be referred to as cell sites. A broadcasting transmitter (BT) transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 of Figure 1 is typically configured inside the mobile terminal 100 to receive broadcast signals transmitted by the BT. Global Positioning System (GPS) satellites for locating the position of the mobile terminal 100, for example, may cooperate with the wireless CDMA communication system. Useful position information may be obtained with greater or fewer than two satellites. It is to be appreciated that other types of position detection technology (i.e., location technology that may be used in addition to, or instead of, GPS location technology) may alternatively be implemented. If desired, at least one of the GPS satellites may alternatively or additionally be configured to provide satellite DMB transmissions.
The present invention provides a method capable of minimizing consumption of electrical energy due to permanently on functions, by activating the permanently on functions according to a movement of a user or a device. [0033] The permanently on function includes various types of functions performed by a microcontroller (MCU) and additional sensors. For example, the permanently on function can be applied to a smart alarm for turning on an LCD, an activity monitoring for monitoring a user's activity, and a sleep monitoring for measuring sleep quality. The permanently on function may also be applicable to a function set in a predetermined mode, as well as an environment monitoring function, a hand-shake prevention function, a GPS function and a screen-off function. To this end, in the present invention, the permanently on function is selectively activated when a motion is detected by an acceleration sensor. For instance, in the present invention, if no motion is detected in an LCD-off state, a device performs a permanently on, low power function. On the other hand, if a motion is detected, the permanently on, low power function is converted into a permanently on, normal function. For example, only when a magnitude of an output value of the acceleration sensor exceeds a predetermined threshold value, the permanently on, low power function can be converted into the permanently on, normal function. Hereinafter, the permanently on, normal function is referred to as the permanently on function for convenience. The acceleration sensor is configured to detect a horizontal movement and a vertical movement of the mobile terminal or a wearable device, based on gravitational acceleration applied on the X, Y, and Z axes. The movement may include a motion of a user, the mobile terminal or a portable device. To detect a motion, the acceleration sensor maintains an "on" state, even in a deactivated state of the permanently on function (i.e., in a permanently on, low power state). Fig.
3 is a flowchart illustrating a low-power, motion-based method according to the present invention. As shown in FIG. 3, a movement of a user or of the mobile terminal is detected by an acceleration sensor (S100). If motion is detected, a permanently on function, set to be activated only when a predetermined condition is satisfied, is activated (S110). On the other hand, if no motion is detected, the permanently on function is not activated (S120). Namely, even if the conditions to activate the permanently on function are satisfied, if a movement of a user or of the mobile terminal is not detected, the permanently on function is not activated; instead, a permanently on, low power function (state) is performed. Thus, in contrast to the conventional art, the permanently on functions of the present invention are not always activated under predetermined conditions. Rather, the permanently on functions of the present invention are selectively enabled depending on whether or not motion has occurred. This can reduce the power consumption. That is, if motion is detected by the acceleration sensor, a microcontroller (MCU) and a predetermined sensor are operated by the detection signal. The MCU can be implemented in the form of the controller 180. FIGS. 4A and 4B are views comparing a method of activating a permanently on function according to the conventional art to a method of activating a permanently on function according to the present invention. As shown in FIG. 4A, the conventional permanently on function permanently maintains an activated state in an LCD-off state, and the MCU and the acceleration sensor are under permanent excitation to detect a predetermined input (e.g., a knocking gesture). Additional sensors are operated to control additional features of the input. Thus, in the conventional art, an MCU and an acceleration sensor are under permanent excitation for a permanently on function. 
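As a rough illustration of the selective activation just described, the gating of FIG. 3 can be sketched as follows. The threshold value and the component names are assumptions made for the sketch; the patent only states that the output of the acceleration sensor is compared against a predetermined threshold.

```python
import math

# Hypothetical threshold on the acceleration magnitude (the patent does
# not give a numeric value).
MOTION_THRESHOLD = 1.5  # arbitrary units

def motion_detected(ax, ay, az):
    """S100: compare the magnitude of the accelerometer output on the
    X, Y and Z axes against the predetermined threshold."""
    return math.sqrt(ax * ax + ay * ay + az * az) > MOTION_THRESHOLD

def powered_components(ax, ay, az):
    """S110/S120: with motion, the MCU and additional sensors are woken
    by the detection signal; otherwise only the acceleration sensor
    stays on (the permanently on, low power state)."""
    if motion_detected(ax, ay, az):
        return {"acceleration_sensor", "mcu", "additional_sensors"}
    return {"acceleration_sensor"}
```

In the no-motion branch only the acceleration sensor draws current, which is the source of the power saving claimed below.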
This can cause a predetermined amount of current to be consumed regardless of the movement of a user. On the other hand, in the present invention, as shown in Fig. 4B, the permanently on function is turned on/off according to a movement of the device, even in an LCD-off state. In the present invention, when no motion is detected, a permanently on, low power function exciting only the acceleration sensor is performed. This can reduce the consumption of electrical energy compared to the conventional art, where the permanently on function is always performed. For example, in the conventional art, a quiescent current of about 0.9 mA is consumed by a permanently on function. However, in the present invention, only a quiescent current of about 0.4 mA is consumed when no motion is detected. This can reduce the power consumption by about 1/4, considering that the lowest power consumption is 2 mA. Accordingly, a standby time can be extended by about 25%. Such selective, motion-based activation of the permanently on function can be applied to various types of mobile terminal operations and to a portable device requiring low power. For example, the selective, motion-based activation of the permanently on function can be applied to a smart wakeup function to turn on a screen by detecting the moment of use of the mobile terminal, to an activity monitoring function to monitor the type of a user's activity and to calculate the calories consumed and a total exercise metabolic rate according to the type of activity, and to a sleep monitoring function to measure and display sleep quality, etc. Hereinafter, various embodiments of a low-power, motion-based method according to the present invention will be explained in more detail with reference to the accompanying drawings. FIG. 5 is a flowchart illustrating a low power smart wakeup operation as an embodiment of a low-power, motion-based method according to the present invention, and FIG. 6 is a view illustrating a detailed embodiment of FIG. 5. 
As shown in FIGS. 5 and 6, if a movement of the mobile terminal is not detected in an LCD-off state, a permanently on, low power function is activated, while the permanently on function is deactivated (S200). In this state, if a movement of the mobile terminal occurs, the permanently on function is activated to turn on the MCU. The MCU recognizes the movement with respect to a user, namely, the gesture of a user bringing the mobile terminal to his face, using a change in the acceleration value detected by the acceleration sensor (an acceleration value applied on the X, Y and Z axes) (S210). For example, when a user holds the mobile terminal, a large acceleration value is applied on the X and Y axes of the acceleration sensor. On the other hand, when a user is looking at a screen of the mobile terminal, a large acceleration value is applied on the Z axis of the acceleration sensor. Once the facing motion is detected, the MCU turns on only the front camera, temporarily (e.g., for five seconds) (S220). Then, the MCU detects a face and recognizes (authenticates) the user by comparing the detected face to a registered face (S230). Thus, the facing movement can indicate the precise instant at which the front camera should be used, namely the moment of use of the mobile terminal. For face detection and facial recognition, a face detection engine and a face recognition engine may be used. [0034] Owing to steps S220 and S230, the power consumption of the mobile terminal can be minimized by temporarily limiting the operating time (for example, to five seconds) of a specific device (e.g., an image capture device, an audio recorder, etc.) operated for final confirmation by a user or for authentication of a user with respect to the use of the mobile terminal. Once the user is recognized (authenticated), the LCD is automatically turned on, and the locked screen state is terminated (S240). 
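A minimal sketch of the wakeup sequence S210-S240, under the axis-dominance heuristic stated above. The dominance comparison and the `authenticate_face` callback are assumptions: the patent describes large axis values and face detection/recognition engines but does not specify either mechanism.

```python
def is_facing_gesture(ax, ay, az):
    """S210: while the user looks at the screen, the large acceleration
    component falls on the Z axis; while merely holding the terminal it
    falls on the X/Y axes. The dominance test is an assumption."""
    return abs(az) > abs(ax) and abs(az) > abs(ay)

def smart_wakeup(ax, ay, az, authenticate_face):
    """S220-S240: on a facing gesture, run the front camera briefly
    (e.g., five seconds) to authenticate the face; on success, turn on
    the LCD and terminate the locked screen state."""
    if not is_facing_gesture(ax, ay, az):
        return "lcd off"                 # stay in the low power state
    if authenticate_face():              # front camera on temporarily
        return "lcd on, unlocked"
    return "lcd off"
```

Note that authentication failure leaves the terminal dark, matching the claim that the camera is the gatekeeper for turning on the LCD.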
Owing to step S240, complicated procedures (for example, turning on the LCD of the mobile terminal and terminating a locked state of a security screen, if necessary) can be performed conveniently after user authentication, by turning on the screen and releasing the locked state. The present invention is not limited to this. Namely, if a security release process is not necessary, the LCD can be turned on immediately. In another embodiment, once a user's face is detected, the LCD can be turned on immediately, and a security release process can then be achieved through face verification of the user. Thus, in the present invention, once a facing movement is detected, the camera is activated for a short time to detect a face and to authenticate the user. Then the LCD is turned on. With such a configuration, the power consumption can be minimized, and a wakeup operation can be performed without direct input by the user on the mobile terminal (for example, the touch of a finger on a screen, or the pressing of a power/home button). Fig. 7 is a flowchart illustrating an operation for controlling additional information according to a motion according to the present invention, and Fig. 8 is a view illustrating a detailed embodiment of Fig. 7. As shown in Figs. 7 and 8, if no movement of the mobile terminal is detected in an LCD-off state, a permanently on, low power function is activated and the permanently on function is deactivated (S300). If a movement of the mobile terminal occurs in this state, the permanently on function is activated to turn on the MCU. The MCU recognizes the movement with respect to a user, namely, the gesture of a user bringing the mobile terminal to his face, using a change in the acceleration value detected by the acceleration sensor (an acceleration value applied on the X, Y and Z axes) (S310). Once the motion toward the user is detected, the MCU temporarily turns on the front camera of the mobile terminal (e.g., for five seconds) (S320). 
Then, the MCU detects a face and recognizes (authenticates) the user by comparing the detected face to a registered face (S330). [0035] If user authentication is complete, the MCU displays an additional information screen on the display unit 151 in an LCD-off state. The additional information screen may display a watch image and notification information (for example, receipt of an email or a call). Thus, a user can easily check various types of additional information without turning on the LCD. Generally, an activity monitoring function indicates a function for automatically monitoring and recording the types of activity included in a user's schedule, by a portable device worn on the wrist of a user (for example, a watch) or by the user's smartphone, for calculating the calories consumed per activity and a total exercise metabolic rate for a day, for providing the calculated information back to the user, and for recommending an exercise suitable for the user. The activity monitoring function can be implemented by a pedometer. Fig. 9 is a view illustrating an example of an activity monitor screen displayed on a portable device. Referring to Fig. 9, the activity monitor screen is implemented as a circular timeline 50 indicating 24 hours. Icons indicating exercise types (the icons are displayed in the same color when they indicate the same exercise), a date, total calories, and the number of steps are displayed on an inner side of the circular timeline. Fig. 10 is a flowchart illustrating a method of monitoring activity at low power as a function of motion. [0036] If no movement of the device is detected, this means that the user is not performing a vigorous (strenuous) activity able to influence an exercise metabolic rate. Thus, a permanently on, low power function is activated, and the permanently on function is deactivated (S400). 
If a user's movement (activity) is vigorous enough to influence an exercise metabolic rate (S410), the permanently on function is activated to turn on the MCU and an additional sensor (for example, a bio-information sensor) (S420). The MCU determines a type of activity and a quantity of activity by collecting/sorting the user's activity data through a sensor hub (S430), and calculates the calories consumed per activity and the total calories according to the determined type and quantity of activity (S440). In the present invention, the electric power consumption may be lower than in the conventional art, because the necessary sensors are activated only when a user performs activity vigorous enough to influence an exercise metabolic rate. A sleep monitoring function indicates a function for measuring the heart rate of a user while the user is sleeping, using a heartbeat sensor referred to as a photoplethysmography (PPG) sensor, and for monitoring a quality of sleep. The PPG sensor is used to measure a heart rate by detecting light reflection that varies according to blood flow. The PPG sensor can provide sleep quality based on a heart rate, or can provide guide information for inducing slow sleep. Figure 11 is a graph illustrating a general sleep state. Referring to Figure 11, a sleep state is generally classified into a slow sleep state (4 stages), a light sleep state (REM) and an awake state. In the present invention, contrary to the conventional art, even if the mobile terminal enters a sleep monitoring mode (or sleep mode), the PPG sensor is selectively operated according to the awake/sleep state determined by the movement of the user, to monitor the sleep state. This can reduce the power consumption. [0037] The awake/sleep state can be determined according to a change in the acceleration value applied to the X, Y and Z axes of the acceleration sensor, namely, a movement of the wrist on which a portable device is worn. 
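Returning to the activity monitoring flow, the calorie computation of S430-S440 might look as follows. The MET (metabolic equivalent) values and the standard MET formula are illustrative assumptions; the patent says only that calories are calculated from the determined type and quantity of activity.

```python
# Illustrative MET values; these numbers are assumptions, not taken
# from the patent.
MET = {"walking": 3.5, "running": 8.0, "cycling": 6.0}

def calories_burned(activity, minutes, weight_kg):
    """Standard MET formula: kcal = MET * weight(kg) * hours."""
    return MET[activity] * weight_kg * (minutes / 60.0)

def total_calories(activity_log, weight_kg):
    """S440: sum the per-activity calories over the sorted activity
    data, e.g. activity_log = [("walking", 60), ("running", 30)]."""
    return sum(calories_burned(a, m, weight_kg) for a, m in activity_log)
```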
For example, if the amount of change in the acceleration value is greater than a reference value, the mobile terminal is determined to be in an awake state. On the other hand, if the amount of change in the acceleration value is smaller than the reference value, the mobile terminal is determined to be in a sleep state. Fig. 12 is a flowchart illustrating a first motion-based, low-power sleep monitoring embodiment according to the present invention, and Fig. 13 is a view illustrating a detailed embodiment of Fig. 12. In this embodiment, even if a user sets a sleep mode, PPG detection is performed only in an actual sleep state (a slow sleep state or a REM state), to reduce the power consumption. That is, the permanently on function is not performed in an awake state, but is performed only in a sleep state, so that the sleep state can be measured by the MCU and the PPG sensor. Referring to FIGS. 12 and 13, a permanently on, low power function is enabled and the permanently on function is disabled from the time when the user starts the sleep monitoring mode (for example, 23:00) until the time when the user falls asleep. In this case, the acceleration sensor detects an acceleration value due to a movement of the user wearing the portable device, and outputs the acceleration value. The MCU can be interrupted according to the magnitude of the acceleration value detected by the acceleration sensor, thereby determining whether the user is currently in an awake state or a sleep state (S500). Namely, the acceleration value can be used as a sleep interrupt for the MCU. If a sleep interrupt occurs when the user falls asleep at 23:30, a permanently on function to turn on the MCU is enabled. The MCU monitors the sleep state of the user by turning on the PPG sensor (S510). [0038] If the user wakes up momentarily during sleep monitoring, or if the user wakes up in the morning, the output value of the acceleration sensor increases and thus a waking interrupt occurs. 
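The interrupt decision of this first embodiment (S500-S510) reduces to comparing the change in the acceleration value against the reference value; the numeric reference used here is an assumption.

```python
REFERENCE_CHANGE = 0.3  # hypothetical reference value for the change

def interrupt_state(prev_accel, curr_accel):
    """S500: a change above the reference value raises a waking
    interrupt (awake state); a change below it indicates sleep."""
    change = abs(curr_accel - prev_accel)
    return "awake" if change > REFERENCE_CHANGE else "sleep"

def ppg_enabled(prev_accel, curr_accel):
    """S510/S520: in this embodiment the MCU and PPG sensor run only in
    the sleep state; a waking interrupt turns them off again, returning
    to the permanently on, low power state."""
    return interrupt_state(prev_accel, curr_accel) == "sleep"
```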
When the waking interrupt occurs, the MCU and the PPG sensor are turned off, so that the permanently on state is converted to a permanently on, low power state (S520). In the present invention, once a sleep monitoring mode is set, the MCU is activated for PPG detection according to motion indicating an awake state or a sleep state. In contrast, in the conventional art, the MCU is continuously enabled for PPG detection. Thus, in the present invention, the amount of current needed to monitor a sleep state can be reduced. Sleep quality, as well as the sleep state, can be determined based on a user's movement, i.e., the output of the acceleration sensor. However, the sleep quality determined only as a function of the acceleration sensor output may not be accurate, since sleep quality is categorized into a plurality of stages, as shown in FIG. 11. Thus, PPG detection is used to improve the accuracy of the measurement of sleep quality. As mentioned above, it is disadvantageous from the energy saving aspect to continuously excite the MCU and the PPG sensor in a sleep state, as in the conventional art. The reason is that continuous excitation of the PPG sensor does not greatly improve the sleep quality measurement, since motion rarely occurs in a slow sleep state, where the output value of the acceleration sensor remains almost the same. Thus, in the present invention, the permanently on function is disabled in a sleep state (the permanently on, low power function is on). The permanently on function is activated whenever a user's movement is detected, thereby exciting the PPG sensor. This can reduce the amount of current needed to perform sleep state monitoring. Fig. 14 is a flowchart illustrating a second, motion-based, low power sleep monitoring embodiment according to the present invention, and Fig. 15 is a view illustrating a detailed embodiment of Fig. 14. [0039] As shown in Figures 14 and 15, a user can select a sleep monitoring mode (or sleep mode) from a menu. 
Once the mobile terminal enters the sleep monitoring mode, the permanently on function is activated in an awake state, so that PPG detection is performed (S600). Then, if the awake state is converted to a sleep state, the permanently on function is disabled, so that a permanently on, low power function energizing only the acceleration sensor is activated (S610). That is, the permanently on, normal state is converted to a permanently on, low power state. [0040] The acceleration sensor outputs an acceleration value based on the user's movement, and the PPG sensor is selectively energized when the MCU is enabled/disabled (triggered) according to the acceleration value. That is, the permanently on, low power function is maintained in a slow sleep state, where user movement rarely occurs. The permanently on, low power function is converted to a permanently on, normal function when a user's movement occurs, for the MCU to be activated and the PPG sensor to be turned on (S620, S630). In another embodiment, an on/off period of the PPG sensor can be intelligently controlled depending on the user's movement. For example, the detection period of the PPG sensor can be set to be long in a slow sleep state, where the movement of a user rarely occurs. On the other hand, the detection period of the PPG sensor can be set to be short in a light sleep state, where the movement of a user occurs often. [0041] Thus, the sleep state of the user is monitored according to an output value of the acceleration sensor and according to a detection value of the PPG sensor (S640). Such an operation is performed repeatedly until the sleep monitoring mode is completed, and sleep quality information is displayed on the screen when the sleep monitoring mode is completed (S650, S660). Sleep quality information includes not only a user's sleep efficiency, but also the change of sleep state (awake, REM sleep, and slow sleep) over time. 
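The intelligent on/off period control mentioned above might be sketched as a simple mapping from observed movement to a PPG detection period; all thresholds and period values here are illustrative assumptions, since the patent only states "long" versus "short".

```python
def ppg_period_seconds(movements_per_minute):
    """Sketch of the intelligent PPG on/off period: a long detection
    period in slow sleep where movement rarely occurs, a shorter one in
    light (REM) sleep where movement occurs often."""
    if movements_per_minute < 1:     # slow sleep: almost no movement
        return 60.0                  # sample rarely to save power
    if movements_per_minute < 5:     # light sleep: occasional movement
        return 10.0
    return 1.0                       # frequent movement / near waking
```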
A user can precisely check his sleep quality through this detailed visualization. As mentioned above, in the present invention, PPG monitoring is performed while the MCU is enabled or disabled depending on a user's movement. Thus, in contrast to the conventional art, where the MCU is continuously activated during a sleep state for PPG detection, the MCU is activated according to the movement of the user in the present invention. More specifically, in the present invention, the instant at which PPG detection is required is determined according to an acceleration value, and PPG detection is performed only at that instant. This can reduce the amount of current needed to monitor a sleep state. Fig. 16 is a view illustrating another embodiment for displaying information regarding sleep quality. As shown in Fig. 16, information regarding the measured sleep quality can be displayed on a pop-up window 51. If a user selects sleep quality information from the pop-up window 51, detailed sleep information can be displayed. In this state, if the user drags the detailed sleep information downward, other information can be checked. If the user drags upward on a portion other than the pop-up window 51, all the information is displayed in an assembled manner so that the user can check it by a toggle operation. Fig. 17 is a view illustrating an embodiment for providing a sleep guide according to the activity monitoring according to the present invention. In the present invention, if it is determined (refer to FIG. 10) that the amount of activity of a user is greater than a mean value, a portable device can suggest, on a screen, entry of the mobile terminal into a sleep monitoring mode, based on a fatigue value monitored during the day and a current PPG value, as shown in FIG. 17. If entry of the mobile terminal into the sleep monitoring mode is confirmed, a permanently on, low power function is performed. 
If an awake state is detected based on an output value of the acceleration sensor while the permanently on function is performed, the MCU suggests a solution for inducing slow sleep. If the user selects a specific solution (for example, music), the permanently on function is disabled (low power mode). As a result, music playback is performed while detection by the acceleration sensor continues. Then, if the user wakes up in the morning, information about the quality of sleep can be displayed on the screen. Typically, an environmental sensor, a sensor for detecting data related to a user's health, such as temperature, humidity, atmospheric pressure and air quality, performs recording/monitoring for 24 hours. However, in the conventional art, the environmental sensor collects environment data at a fixed period, regardless of location and user movement (riding in a vehicle, walking, or activity in an outdoor environment). This can lead to continuous power consumption. Thus, if the energizing of a plurality of sensors, the activation of the MCU and of an application processor (AP) for analyzing the collected data, and the collection period of the environment sensor are determined according to a movement, the current consumption can be minimized. Fig. 18 is a view illustrating an example of an environmental sensor monitor screen according to the present invention. As shown in Fig. 18, the environmental sensor monitor screen is implemented as a circular timeline 60 indicating 24 hours. Icons indicating environment types (the icons are displayed in the same color when they indicate the same environment) and the types of collected data are displayed on an inner side of the circular timeline 60. Once an icon is selected, a detailed detection value of the environmental sensor can be displayed. Fig. 19 is a view illustrating an embodiment for monitoring environment data at low power as a function of motion in accordance with the present invention. 
[0042] Generally, environmental sensor data is more important at a location where a user remains for a long time without motion, whereas it is less important where a user's movement (for example, riding in a vehicle, walking, or activity in an outdoor environment) occurs continuously. Thus, in the present invention, the data collection period of the environment sensor having a permanently on function is intelligently controlled according to motion, thereby executing the function of recording a user's environment with low power. That is, as shown in Fig. 19, the rate of collecting environment data is increased at a location where a user remains for a long time without motion, since environment sensor data is more important at that location (for example, from 0.2 Hz to 1 Hz). On the other hand, if a user or device movement occurs continuously, the rate of collecting environment data is reduced, because environment sensor data is less important and the accuracy of the environment sensor is reduced (for example, from 0.1 Hz to 0.03 Hz). [0043] In the present invention, the determination of motion and of a motion hold time is performed through comparison to a preset reference value. In the present invention, a connection having an electrical power consumption (e.g., Bluetooth), a type of device (e.g., a gyro sensor, a GPS and a camera), and various functions (for example, screen off/screen lock) can be controlled according to a movement of the user or device. Fig. 20 is a flowchart illustrating an embodiment for controlling a screen off time according to a motion according to the present invention. As shown in Fig. 20, if no input has been applied for a predetermined time in a screen-on state, the controller 180 detects, by means of the proximity sensor, whether a predetermined object has moved close to the mobile terminal or not (S700, S710). 
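The environment-sensor rate control of Fig. 19 described above amounts to a simple rule; the two rates used here are taken from the examples in the text, and reducing the decision to a single boolean is a simplification.

```python
def environment_rate_hz(continuous_movement):
    """Per the examples above: raise the collection rate toward 1 Hz
    when the user stays in one place without motion, and lower it
    toward 0.03 Hz when user or device movement occurs continuously."""
    return 0.03 if continuous_movement else 1.0
```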
If it is detected that a predetermined object has approached the mobile terminal, this means that the mobile terminal is placed face down, or that the mobile terminal is positioned in a pocket or bag. Thus, the controller 180 controls the screen off time to be shorter than a normal time (S720). On the other hand, if no input has been applied for a predetermined time and no object has approached the mobile terminal, the controller 180 detects movement of the mobile terminal through the acceleration sensor (S730). If a movement of the mobile terminal is detected, the controller 180 controls the screen off time to be longer than a normal time (S740). On the other hand, if no movement of the mobile terminal is detected, the controller 180 sets the screen off time to the normal time (S750). If no input has been applied for a predetermined time, the screen off time is first controlled by the proximity sensor. If no object has approached the mobile terminal, a movement of the mobile terminal is then determined to control the screen off time. With such a configuration, the power consumption due to a screen-on state can be reduced when the screen is not being used. In the present invention, only the control of a screen off time has been explained, for convenience. However, control of a screen lock time can also be achieved in the present invention. In the above embodiment, a screen off function and a screen lock function may be provided more flexibly, as the motion-based, low-power process of the present invention is applied to the conventional proximity-sensor-based screen off/screen lock function. A mobile terminal (or smartphone) and a portable device (e.g., a smart watch or a glasses-type terminal) can be operated together with each other via Bluetooth. In this case, unnecessary power consumption can occur, because the Bluetooth connection is maintained continuously even when the user is not wearing the portable device or the user is sleeping. 
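The two-stage decision of Fig. 20 (proximity first, then motion) can be sketched as follows. The normal timeout and the scale factors are assumptions, since the patent only says "shorter" and "longer" than a normal time.

```python
NORMAL_TIMEOUT_S = 30.0  # assumed normal screen off time, in seconds

def screen_off_timeout(object_near, terminal_moving):
    """S700-S750: an object near the proximity sensor (face down or
    pocketed terminal) shortens the timeout; otherwise motion lengthens
    it and stillness keeps the normal value."""
    if object_near:
        return NORMAL_TIMEOUT_S / 2   # shorter than normal (S720)
    if terminal_moving:
        return NORMAL_TIMEOUT_S * 2   # longer than normal (S740)
    return NORMAL_TIMEOUT_S           # normal time (S750)
```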
Thus, if the connected state between the portable device and the user's smartphone is maintained or terminated according to whether movement of the portable device is detected, unnecessary power consumption can be reduced. Fig. 21 is a flowchart illustrating an embodiment for terminating a connected state according to a motion according to the present invention. As shown in Fig. 21, in a state where a Bluetooth function is activated between the mobile terminal and a smart watch (S800), the controller 180 of the smart watch detects movement of the smart watch for a predetermined period of time using an acceleration sensor (S810). Next, the controller 180 determines whether a movement of the smart watch has occurred within the predetermined time or not (S820). If a movement of the smart watch is detected within the predetermined time, the controller 180 maintains the activated ("on") state of the Bluetooth function (S830). On the contrary, if no movement of the smart watch is detected for the predetermined time, the controller 180 disables the Bluetooth function (S840). A case where a movement of the smart watch is detected within a predetermined time may include a case where a user is continuously moving, a case where a user stares at a point for a predetermined time (e.g., with a glasses-type terminal), and a case where a user is temporarily in a stationary state (for example, a user wearing a smart watch on his wrist rests his chin on his hand). A case where no movement of the smart watch is detected for a predetermined time may include a case where a user wearing a smart watch on his wrist is sleeping, or a case where a user is not wearing the smart watch. If a user's finger moves, or a user breathes out, when pressing a button to capture a subject, the camera is minutely moved. This can cause blur in the captured image. In order to solve such image movement, a hand shake prevention function has recently been developed. [0044] However, in the conventional art, the following problem can be caused. 
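The Bluetooth link control of Fig. 21 is, in essence, an idle timeout on the watch's motion; the window length here is an assumption, as the patent only refers to a predetermined time.

```python
IDLE_WINDOW_S = 600.0  # assumed predetermined time without motion

def bluetooth_on(seconds_since_last_motion):
    """S820-S840: keep the Bluetooth function on while watch movement
    has been seen within the window, disable it once no movement is
    detected for the predetermined time."""
    return seconds_since_last_motion < IDLE_WINDOW_S
```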
In a state where the hand shake prevention function is set, if a user operates the camera using a tripod or at a stationary location with no movement, the hand shake prevention function is still performed. This can lead to unnecessary energy consumption. Thus, the present invention provides a method for selectively controlling the switching on/off of a hand shake prevention function by detecting movement of the mobile terminal. [0045] Figs. 22-24 are views illustrating a method of controlling various functions according to a motion according to the present invention. Fig. 22 is a view illustrating an example for activating or deactivating a hand shake prevention function according to a movement. As shown in FIG. 22, the controller 180 controls the on/off state of the hand shake prevention function by detecting, in a camera capture mode, a movement of the user who is about to take a picture. Namely, the controller 180 determines, in a camera capture mode, whether the mobile terminal is held by a user or fixed. If it is determined that the mobile terminal is in a fixed state, the controller 180 automatically disables the hand shake prevention function to reduce power consumption. Thus, in the present invention, if a user performs an image capture function while holding the mobile terminal in his hand, or while the user is in a moving position (for example, inside a moving vehicle, on a moving boat, etc.), the hand shake prevention function is activated. On the contrary, if a user performs an image capture function using a tripod, or if the user is at a stationary location with no movement, the hand shake prevention function is automatically disabled. Fig. 23 is a view illustrating an example for reducing power consumption by reducing a GPS activation time, by automatically turning a GPS function on/off based on the detection of a user's movement. [0046] As shown in Fig. 23, the controller 180 recognizes the movement of a user using the acceleration sensor. 
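The capture-mode decision for the hand shake prevention function reduces to a held-versus-fixed test. Treating a near-zero variance of the acceleration signal as the fixed state (tripod, resting surface) is an assumption about how that test could be realized; the patent does not specify the criterion.

```python
FIXED_VARIANCE = 1e-3  # assumed variance below which the terminal is fixed

def stabilization_enabled(accel_samples):
    """Disable hand shake prevention when the terminal appears fixed,
    enable it when the terminal is held or in a moving position."""
    mean = sum(accel_samples) / len(accel_samples)
    variance = sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
    return variance > FIXED_VARIANCE
```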
If a user's movement greater than a predetermined magnitude (for example, walking or running) is detected, the controller 180 turns on the GPS function. [0047] On the contrary, if no movement greater than the predetermined magnitude is detected, or if the user is in a stationary state, the controller 180 automatically turns off the GPS function. Fig. 24 is a flowchart illustrating an operation for turning a glasses function on or off by detecting the movement of a user. [0048] Referring to Fig. 24, the controller 180 may determine the worn state of a glasses-type terminal through a proximity sensor disposed on an inner side of the frame, and may detect when a user removes the glasses-type terminal (S900). If a user removes the glasses-type terminal, the controller 180 detects the user's movement using an acceleration sensor (S910). The reason is that it is difficult to determine whether a user no longer wishes to use the glasses-type terminal simply because the user has removed it. For example, even if a user holds the glasses-type terminal for a period of time, or has put the glasses-type terminal in a pocket after having removed it, the user may wish to maintain a permanently on function (for example, motion picture capture or dialogue recording). Thus, it is necessary to detect whether a movement of the user has occurred, and then to check whether the user no longer wishes to use the glasses-type terminal (S920). [0049] If a user's movement has occurred, the glasses function is maintained (S930). On the contrary, if no movement of the user has occurred, it is determined that the user is no longer using the glasses-type terminal. As a result, the glasses function is automatically disabled to reduce the power consumption (S940). [0050] FIG. 25 is a view illustrating an embodiment for providing a pedestrian safety preview as a function of motion, as an example of a motion-based, low-power processing method in accordance with the 
present invention. Fig. 26 is a view illustrating a detailed embodiment of Fig. 25. The embodiment of Fig. 25 may be performed following the function shown in Fig. 5, i.e., the function for automatically turning on the LCD and terminating a locked screen state. Namely, if a user uses the mobile terminal in a state where the LCD is turned on and the locked screen state has been terminated, the controller 180 can determine whether or not the user is walking, based on an acceleration value obtained by the acceleration sensor (S1000, S1010). For example, if a user is sitting or standing, the acceleration value does not change significantly. However, if a user is walking, the acceleration value periodically increases and decreases. Thus, the controller 180 can determine whether a user is walking or not, depending on the change in the acceleration value. If the user is walking, the camera is activated so that a safety preview screen 70 can be displayed on the display unit 151 in the form of a pop-up screen (S1020). On the contrary, if the user is not walking, the camera is disabled so that no safety preview screen is provided. Thus, in the present invention, a small camera preview screen is provided according to the walking state of the user while the mobile terminal is being used. This allows the user to conveniently see what is in front of him when using the mobile terminal while walking. In particular, in the present invention, the camera preview screen is provided only when the user walks, through recognition of the surrounding situation. This enables the mobile terminal to save power by turning on the camera only when necessary. Fig. 27 is a view illustrating an embodiment for automatically turning off a screen at low power according to a motion according to the present invention, and Fig. 28 is a view illustrating a change of an acceleration value according to a position of the arm. As shown in Fig. 
27, the controller 180 of a smart watch can detect a movement of a user's arm (S1100). In the case of the smart watch, a movement of a user's arm can be divided into a state where the user is looking at the screen, and a state where the user is not looking at the screen. The state where the user is not looking at the screen means a state where the user's arm is lowered. In this state, a large acceleration value is applied to the "X-Y" axes of the acceleration sensor. On the contrary, in the state where the user is looking at the screen, a large acceleration value is applied on the "Z" axis of the acceleration sensor. The controller 180 determines whether the user has lowered his arm after viewing the screen, based on an acceleration value detected by the acceleration sensor (S1110). If it is determined that the acceleration value increases in the directions of the "X-Y" axes after having increased in the direction of the "Z" axis, the controller 180 recognizes that the user has lowered his arm after looking at the screen. Thus, the controller 180 automatically turns off the screen of the smart watch (S1120). In contrast, if the acceleration value does not increase in the "X-Y" axis directions after increasing in the "Z" axis direction, the controller 180 maintains the illuminated screen state of the smart watch. In the above embodiment, the screen is turned off faster than in a case where the screen is automatically turned off after a predetermined time has elapsed. This can allow the consumption of electrical energy to be reduced. The present invention may have the following advantages. Firstly, a permanently on function performed in the mobile terminal, according to a movement of the user or of the mobile terminal, is implemented with low power. This can extend the battery life of the mobile terminal or of a portable device.
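The axis test of Figs. 27-28 (gravity loading the "X-Y" axes when the arm hangs down, and the "Z" axis when the watch faces the user) can be sketched as follows. The axis convention and the viewing-then-lowered transition test are assumptions for illustration, not the patent's exact criterion.

```python
# Hedged sketch of the smart-watch logic (S1100-S1120): classify each
# accelerometer sample by which axes carry the larger value, then turn the
# screen off once a "viewing" sample is followed by a "lowered" sample.

def arm_state(ax, ay, az):
    """Classify a sample: 'viewing' if the Z axis dominates, else 'lowered'."""
    xy = (ax * ax + ay * ay) ** 0.5
    return "viewing" if abs(az) > xy else "lowered"


def screen_should_turn_off(samples):
    """S1110-S1120: detect the viewing -> lowered transition.

    Returns True when the user looked at the watch (Z-dominant sample) and
    then dropped the arm (X-Y-dominant sample), so the screen can be turned
    off immediately instead of waiting for a fixed timeout.
    """
    states = [arm_state(*s) for s in samples]
    for prev, cur in zip(states, states[1:]):
        if prev == "viewing" and cur == "lowered":
            return True
    return False
```

Turning the screen off on this transition, rather than after a fixed timeout, is what shortens the lit-screen time in the embodiment above.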
Secondly, since the screen is automatically turned on/off when the behavior of a user is recognized, a wake-up operation can be performed without direct input by the user on the mobile terminal (for example, the contact of a finger on the screen, or the pressing of a power/home button). In addition, unnecessary power consumption due to a lit screen state can be reduced. Various embodiments may be implemented using a machine-readable medium having instructions stored therein for execution by a processor to perform the various methods set forth herein. Examples of possible machine-readable media include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, the other types of storage media disclosed herein, and combinations thereof. If desired, the machine-readable medium can be embodied as a carrier wave (for example, a transmission over the Internet). The processor may include the controller 180 of the mobile terminal.
Claims:
Claims (20) [0001] 1. A motion-based, low-power processing method of a mobile terminal (100), the method comprising: detecting (S100) a user's movement using an acceleration sensor in a permanently on, low power state, in which only the acceleration sensor is activated; selectively activating (S110) a permanently on function according to the detected movement; and deactivating (S120) the permanently on function when no movement is detected in a permanently on state, and automatically entering the permanently on, low power state. [0002] The method of claim 1, wherein the step of selectively activating a permanently on function includes: checking a type of the detected movement; detecting a face by temporarily activating an image capture apparatus if the detected movement type is a movement of bringing the mobile terminal (100) toward a face; authenticating a user whose face is detected; and automatically terminating a locked screen state after turning on a screen, or displaying additional information in a screen-off state, when the user authentication is complete, wherein the additional information includes a watch image or notification information indicating receipt of an email and a call. [0003] The method of claim 1, wherein the step of selectively activating a permanently on function includes: checking a type of the detected movement; and displaying a safety preview on one side of a screen when the checked movement type is walking. [0004] The method of claim 1, wherein, in the step of automatically deactivating the permanently on function, if the activity of a user is not detected while the amount of activity of the user is measured by a sensor, the sensor is disabled so as not to measure the amount of activity of the user.
[0005] The method of claim 1, wherein the step of selectively activating a permanently on function includes: determining a sleep type based on the detected movement; and controlling a sleep quality so as not to be measured if the determined sleep type corresponds to a waking state, but to be measured if the determined sleep type corresponds to a sleeping state. [0006] The method of claim 1, further comprising controlling the activated permanently on function according to a user's movement, wherein the step of controlling the permanently on function includes: checking a degree of importance of measurement data at the current position of a user, based on a continuously detected state of movement; and adaptively controlling a period of environmental data measurement according to the checked degree of importance of the measurement data. [0007] The method of claim 6, wherein the step of controlling the permanently on function includes: firstly, determining a screen-off time according to whether or not an approaching object is detected in a screen-on state; and secondly, determining a screen-off time according to whether or not movement is detected if no approaching object is detected, wherein the screen-off time is controlled to be longest when movement is detected, and to be shortest when an approaching object is detected. [0008] The method of claim 1, wherein the step of automatically deactivating the permanently on function includes temporarily turning off the permanently on function currently being executed if no movement occurs, and wherein the permanently on function currently being executed includes a Bluetooth function, a hand-shake prevention function and a GPS function.
[0009] The method of claim 1, wherein the step of automatically deactivating the permanently on function includes: detecting a separated state of a glasses-type terminal; detecting a movement of the glasses-type terminal when the glasses-type terminal is separated; and automatically stopping a glasses function when no movement of the glasses-type terminal occurs. [0010] The method of claim 1, wherein the step of automatically deactivating the permanently on function includes: determining a wrist position of a user who wears a smart watch, based on a detection value detected by the acceleration sensor; and automatically turning off the screen if the determined position of the wrist is not a position for viewing a screen. [0011] A mobile terminal (100), comprising: an acceleration sensor configured to detect a movement of a user in a permanently on, low power state, the acceleration sensor being activated in the permanently on, low power state; and a controller (180) configured to selectively activate a permanently on function by turning it on/off according to the detected movement, wherein the controller (180) activates a permanently on function when a movement of a user is detected, and deactivates the permanently on function if no movement is detected in a permanently on state, and automatically enters the permanently on, low power state. [0012] The mobile terminal (100) according to claim 11, wherein, if the detected movement is a movement of bringing the mobile terminal (100) toward a face, the controller (180) performs face detection and facial authentication by temporarily activating a camera, and wherein the controller (180) automatically terminates a locked screen state after turning on a screen, or displays additional information in a screen-off state, when the facial authentication is completed, wherein the additional information includes a watch image or notification information indicating the receipt of an email and a call.
[0013] The mobile terminal (100) of claim 11, wherein the controller (180) checks a type of movement based on a detection value detected by the acceleration sensor, and displays a safety preview on one side of a screen if the checked movement type is walking. [0014] The mobile terminal (100) of claim 11, wherein the controller (180) turns off the sensor so as not to measure the amount of user activity when a user's activity is not detected while the amount of activity of the user is measured by a sensor. [0015] The mobile terminal (100) of claim 11, wherein the controller (180) is further configured to: determine a sleep type according to the detected movement, and measure a sleep quality when the determined sleep type corresponds to a sleeping state and not measure the sleep quality when the determined sleep type corresponds to a waking state. [0016] The mobile terminal (100) according to claim 11, wherein the controller (180) checks a degree of importance of measurement data at the current position of a user, according to a continuously detected state of movement, and adaptively controls a period of environmental data measurement according to the checked degree of importance of the measurement data. [0017] The mobile terminal (100) according to claim 16, wherein the controller (180), firstly, determines a screen-off time according to whether or not an approaching object is detected in a screen-on state, and, secondly, determines a screen-off time according to whether or not movement is detected if no approaching object is detected, and wherein the screen-off time is controlled to be longest when movement is detected, and to be shortest when an approaching object is detected.
[0018] The mobile terminal (100) according to claim 11, wherein the controller (180) temporarily shuts down the permanently on function currently being executed if no movement occurs, and wherein the temporarily halted permanently on function includes a Bluetooth function, a hand-shake prevention function and a GPS function. [0019] The mobile terminal (100) according to claim 11, wherein the controller (180) automatically deactivates a glasses function if no movement of a glasses-type terminal is detected in a separated state of the glasses-type terminal. [0020] The mobile terminal (100) according to claim 11, wherein the controller (180) determines a wrist position of a user who is wearing a smart watch, based on a detection value detected by the acceleration sensor, and automatically turns off the screen when the determined position of the wrist is not a position for viewing a screen.
Patent family:
Publication number | Publication date FR3026202B1|2021-09-10| US10028227B2|2018-07-17| US20170280394A1|2017-09-28| KR102276023B1|2021-07-12| KR20160034075A|2016-03-29| WO2016043405A1|2016-03-24|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title JP4611866B2|2005-11-02|2011-01-12|Sharp Corporation|Electronics| CN102111490A|2009-12-23|2011-06-29|Sony Ericsson Mobile Communications|Method and device for automatically unlocking mobile terminal keyboard| US8994499B2|2011-03-16|2015-03-31|Apple Inc.|Locking and unlocking a mobile device using facial recognition| JP5103546B1|2011-07-11|2012-12-19|NTT Docomo, Inc.|Mobile terminal and continuous movement detection method| US9596084B2|2011-09-09|2017-03-14|Facebook, Inc.|Initializing camera subsystem for face detection based on sensor inputs| JP5792570B2|2011-09-21|2015-10-14|Saitama NEC Corporation|Portable terminal device and program| JP6062175B2|2012-07-25|2017-01-18|Kyocera Corporation|Portable terminal, power saving control program, and power saving control method| JP6138506B2|2013-02-12|2017-05-31|NEC Platforms, Ltd.|Mobile communication terminal| US9661221B2|2013-03-15|2017-05-23|Qualcomm Incorporated|Always-on camera sampling strategies|KR20170010638A|2015-07-20|2017-02-01|Samsung Electronics Co., Ltd.|Measuring method of signal and electronic device thereof| CN105573495B|2015-12-14|2020-06-23|Lenovo Ltd.|Information processing method and wearable device| CN108603385B|2016-01-21|2020-12-01|Amadas Co., Ltd.|Input device of door lock| WO2017172993A1|2016-03-29|2017-10-05|Resolution Products, Inc.|Universal protocol translator| US10335045B2|2016-06-24|2019-07-02|Universita Degli Studi Di Trento|Self-adaptive matrix completion for heart rate estimation from face videos under realistic conditions| CN106453873B|2016-09-28|2019-07-23|Beijing Xiaomi Mobile Software Co., Ltd.|Information prompting method and device| KR20180064717A|2016-12-06|2018-06-15|Samsung Electronics Co., Ltd.|Method for content output and electronic device supporting the same| KR20180068127A|2016-12-13|2018-06-21|LG Electronics Inc.|Mobile terminal and method for controlling the same| US11010595B2|2017-03-23|2021-05-18|Samsung Electronics Co., Ltd.|Facial verification method and apparatus| CN107341006B|2017-06-21|2020-04-21|Guangdong Oppo Mobile Telecommunications Corp., Ltd.|Screen locking wallpaper recommendation method and related products|
KR20190021595A|2017-08-23|2019-03-06|삼성전자주식회사|Electronic device and method for controlling using the electronic device| KR102246777B1|2018-05-09|2021-04-30|삼성전자주식회사|Method for displaying content in the expandable screen area and electronic device supporting the same| US10659595B2|2018-10-22|2020-05-19|Motorola Mobility Llc|Determining orientation of a mobile device| CN109831817B|2019-02-27|2020-09-11|北京达佳互联信息技术有限公司|Terminal control method, device, terminal and storage medium| KR102141034B1|2019-04-18|2020-08-04|주식회사 아신테크윈|Location-able Hazardous Carriers| WO2020251593A1|2019-06-14|2020-12-17|Hewlett-Packard Development Company, L.P.|Setting device states based on modes| US10594856B1|2019-06-27|2020-03-17|Shenzhen GOODIX Technology Co., Ltd.|Immediate-mode camera for portable personal electronic devices|
Legal events:
2016-05-30| PLFP| Fee payment|Year of fee payment: 2 | 2017-05-30| PLFP| Fee payment|Year of fee payment: 3 | 2018-05-29| PLFP| Fee payment|Year of fee payment: 4 | 2019-03-27| PLFP| Fee payment|Year of fee payment: 5 | 2019-10-18| PLSC| Publication of the preliminary search report|Effective date: 20191018 | 2020-04-24| PLFP| Fee payment|Year of fee payment: 6 | 2021-04-30| PLFP| Fee payment|Year of fee payment: 7 |
Priority:
申请号 | 申请日 | 专利标题 KR1020140125142A|KR102276023B1|2014-09-19|2014-09-19|Mobile terminal and movemetn based low power implementing method thereof| 相关专利