Patent abstract:
A glasses-type terminal and a control method thereof are provided. The glasses-type terminal includes: a mount unit (401) configured to be worn on a user's head; a display unit (451); and a control unit (480) configured to turn on the electrical power of an image acquisition device (421) when preset conditions are satisfied, to analyze an image captured by the image acquisition device (421) to produce image capture guide information, and to control the display unit (451) to output the produced image capture guide information.
Publication number: FR3022424A1
Application number: FR1554973
Filing date: 2015-06-01
Publication date: 2015-12-18
Inventors: Yujune Jang;Taeseong Kim;Taekyoung Lee;Jeongyoon Rhee
Applicant: LG Electronics Inc;
IPC main class:
Patent description:

[0001] The present description relates to a glasses-type terminal that can be worn on a user's head, like eyeglasses, and an associated control method. Terminals may generally be classified as mobile/portable terminals or stationary terminals depending on their mobility. Mobile terminals may further be classified as handheld terminals or vehicle-mounted terminals depending on whether a user can directly carry the terminal. Mobile terminals have become increasingly functional. Examples of such functions include data and voice communications, image and video capture via an image acquisition device, audio recording, playback of music files via a speaker system, and display of images and video on a display screen. Some mobile terminals include additional functionality that supports game playing, while other terminals are configured as media players. More recently, mobile terminals have been configured to receive broadcast and multicast signals that permit viewing of content such as videos and television programs. As terminal functions become more diverse, terminals can support more complicated functions such as capturing images or video, playing music or video files, playing games, receiving broadcast signals, and the like. By implementing such functions comprehensively and collectively, the mobile terminal may be realized in the form of a multimedia player or device.
[0002] Efforts are underway to support and increase the functionality of mobile terminals. Such efforts include software and hardware improvements, as well as changes and improvements in the structural components. Recently, mobile terminals have extended to portable devices that can be worn on the user's body, going beyond simple use of the terminal held in the user's hands. An example of such a portable device is a glasses-type terminal.
[0003] In addition, a glasses-type terminal can capture an image identical to the scene seen by the user. However, when the image acquisition device of a glasses-type terminal is constantly turned on, battery consumption increases and heat may be generated.
[0004] Therefore, one aspect of the present invention is to address the above-noted problems and others. Another aspect of the present invention is to provide a glasses-type terminal adapted to turn on the electrical power of an image acquisition device when preset conditions are detected, to analyze an image captured by the image acquisition device, and to provide image capture guide information according to the analysis results, and an associated control method. In order to achieve these and other advantages and in accordance with the purpose of the present description, as embodied and broadly described herein, a glasses-type terminal includes: a mount unit configured to be worn on a user's head; a display unit; and a control unit configured to turn on the electrical power of an image acquisition device when preset conditions are satisfied, to analyze an image captured by the image acquisition device to produce image capture guide information, and to control the display unit to output the produced image capture guide information. In an illustrative embodiment of the present description, the control unit can detect the preset conditions according to at least one of a touch input from the user, a voice input from the user, and body information of the user.
[0005] In an illustrative embodiment of the present description, when a user control command for following the output image capture guide information is inputted, the control unit can control the image acquisition device to capture an image according to the output image capture guide information and store the captured image.
[0006] In an illustrative embodiment of the present description, when a motion of a subject is detected from the image captured by the image acquisition device, the control unit can output, as image capture guide information, a guide prompting the user's eyes to follow the movement of the subject. In an illustrative embodiment of the present description, when motion of a subject is detected from the image captured by the image acquisition device, the control unit can output, as image capture guide information, a guide for continuously capturing images or video with the image acquisition device according to the movement of the subject. In an illustrative embodiment of the present description, when a user control command for following the output image capture guide information is inputted, the control unit can control the image acquisition device to rotate so as to continuously capture images or video. In order to achieve these and other advantages and in accordance with the purpose of the present disclosure, as embodied and broadly described herein, a method for controlling a glasses-type terminal includes: (a) detecting preset conditions; (b) when it is detected that the motion or behavior of a particular user satisfies the preset conditions, turning on the electrical power of an image acquisition device; (c) analyzing an image captured by the image acquisition device to produce image capture guide information; and (d) outputting the produced image capture guide information on the display unit. In an illustrative embodiment of the present description, operation (a) may include: detecting the preset conditions according to at least one of a touch input from the user, a voice input from the user, and body information of the user. In an illustrative embodiment of the present description, the method may further include: (e) when a user control command for following the output image capture guide information is inputted, controlling the image acquisition device to capture an image according to the output image capture guide information and providing a command to store the captured image. In an illustrative embodiment of the present description, operation (d) may include: when a motion of a subject is detected from the image captured by the image acquisition device, outputting, as image capture guide information, a guide prompting the user's eyes to follow the movement of the subject. In an illustrative embodiment of the present description, operation (d) may include: when motion of a subject is detected from the image captured by the image acquisition device, outputting, as image capture guide information, a guide for continuously capturing images or video with the image acquisition device according to the movement of the subject. In an illustrative embodiment of the present description, operation (d) may include: when a user control command for following the output image capture guide information is entered, rotating the image acquisition device to continuously capture images or video. In addition, the scope of applicability of the present application will become more apparent from the detailed description provided hereinafter. However, it should be understood that the detailed description and the specific examples, while indicating preferred embodiments of the invention, are provided for illustrative purposes only. Various changes and modifications will be apparent to those skilled in the art from reading the detailed description.
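As an illustration only, the control flow of steps (a) to (d), together with the kinds of guide information discussed later with reference to FIGS. 4 to 6 and FIGS. 9 and 10, can be sketched as follows in Python. Every class name, threshold, and guide string here is an assumption made for the sketch; the patent does not prescribe a particular implementation.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    brightness: float        # mean luminance, 0.0-1.0
    shake: float             # estimated hand-shake level, 0.0-1.0
    horizon_tilt_deg: float  # tilt of the horizon in degrees
    subject_moving: bool     # whether subject motion was detected

class Camera:
    """Stand-in for the image acquisition device; all values are mocked."""
    def __init__(self):
        self.powered = False
    def power_on(self):
        self.powered = True
    def capture_preview(self) -> Frame:
        # a fixed frame stands in for real sensor data
        return Frame(brightness=0.1, shake=0.6,
                     horizon_tilt_deg=8.0, subject_moving=False)

def analyze(frame: Frame) -> list[str]:
    """Step (c): derive image capture guide information from the frame."""
    guide = []
    if frame.brightness < 0.2:
        guide.append("Turn on the flash")           # cf. FIG. 4
    if frame.shake > 0.5:
        guide.append("Enable anti-vibration mode")  # cf. FIG. 5
    if abs(frame.horizon_tilt_deg) > 5.0:
        guide.append("Level the horizon")           # cf. FIG. 6
    if frame.subject_moving:
        guide.append("Follow the subject with your eyes")  # cf. FIGS. 9-10
    return guide

def on_preset_condition(camera: Camera) -> list[str]:
    """Steps (b)-(d): power the camera on only when triggered, analyze the
    preview, and return the guide information for the display unit."""
    camera.power_on()              # (b) camera was off until now
    frame = camera.capture_preview()
    return analyze(frame)          # (c); caller outputs this, (d)

if __name__ == "__main__":
    print(on_preset_condition(Camera()))
    # ['Turn on the flash', 'Enable anti-vibration mode', 'Level the horizon']
```

The point of deferring `power_on` to the trigger is the one stated in paragraph [0003]: the image acquisition device draws no current while the user is not about to capture anything, which limits battery drain and heat.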
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of the present description, illustrate embodiments and, together with the description, serve to explain the principles of the invention. In the drawings: FIG. 1A is a block diagram of a mobile terminal in accordance with the present description; FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, viewed from different directions; FIG. 2 is a perspective view illustrating an example of a glasses-type mobile terminal relating to another illustrative embodiment of the present description; FIG. 3 is a flowchart illustrating a method of operating a glasses-type terminal according to an illustrative embodiment of the present disclosure; FIG. 4 is a conceptual view illustrating an illustrative embodiment in which the use of a flash is provided as image capture guide information; FIG. 5 is a conceptual view illustrating an illustrative embodiment in which the use of an anti-vibration mode is provided as image capture guide information; FIG. 6 is a conceptual view illustrating an illustrative embodiment in which the use of a horizontal composition is provided as image capture guide information; FIGS. 7 and 8 are conceptual views illustrating illustrative embodiments in which the use of a central composition is provided as image capture guide information; FIGS. 9 and 10 are conceptual views illustrating illustrative embodiments in which a user's response to the motion of a subject is provided as image capture guide information; FIG. 11 is a conceptual view illustrating an illustrative embodiment in which a response of an image acquisition device according to a movement of a subject is provided as image capture guide information. A description will now be given in detail according to embodiments described herein, with reference to the accompanying drawings. To improve the clarity of the description with reference to the drawings, identical or similar reference numbers may be assigned to the same or equivalent components, and the description of these components will not be repeated. In general, a suffix such as "module" and "unit" may be used to refer to elements or components. The use of such a suffix herein is merely intended to facilitate the description of the invention, and the suffix itself is not intended to give a special meaning or function. In the present description, what is well known to those skilled in the art has generally been omitted for brevity. The accompanying drawings are used to facilitate the understanding of various technical features, and it is to be understood that the embodiments set forth in the present application are not limited by the accompanying drawings. As such, the present disclosure should be construed as extending to any modifications, equivalents and substitutes in addition to those particularly shown in the accompanying drawings. The mobile terminals described herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like.
As a non-limiting example only, the remainder of the description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as the types noted above. In addition, these teachings can also be applied to stationary terminals such as digital TVs, desktop computers, and the like. Reference is now made to FIGS. 1A to 1C, where FIG. 1A is a block diagram of a mobile terminal in accordance with the present description, and FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, viewed from different directions. The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and an electric power supply unit 190. Implementing all of the illustrated components is not a requirement, and greater or fewer components may alternatively be implemented. The wireless communication unit 110 typically includes one or more modules that enable communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, or communications between the mobile terminal 100 and an external server.
[0007] In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-distance communication module 114, and a location information module 115. The input unit 120 includes an image acquisition device 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a programmable key, and the like) to allow a user to enter information. Data (e.g., audio, video, image, and the like) are obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user instructions, and combinations thereof. The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in FIG. 1A, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a digital scanning sensor, an ultrasonic sensor, an optical sensor (for example, the image acquisition device 121), the microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (e.g., an electronic nose, a health sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 can be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 is typically configured to output various types of information, such as audio, video, touch output, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an inter-layer structure or an integrated structure with a touch sensor in order to constitute a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as serve as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, headset ports, and the like. In some cases, the mobile terminal 100 can perform assorted control functions associated with a connected external device, in response to the device being connected to the interface unit 160. The memory 170 is typically implemented to store data
to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs can be downloaded from an external server via wireless communication. Other application programs may be installed in the mobile terminal 100 at the time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The controller 180 typically serves to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs.
[0008] The controller 180 can provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are inputted or outputted by the various components illustrated in FIG. 1A, or by activating application programs. As an example, the controller 180 controls some or all of the components illustrated in FIGS. 1A to 1C according to the execution of application programs that have been stored in the memory 170. The electric power supply unit 190 may be configured to receive external electrical power or to provide internal electrical power in order to supply the appropriate electrical power required for operating the elements and components included in the mobile terminal 100. The electric power supply unit 190 may include a battery, and the battery may be configured to be incorporated in the terminal body, or configured to be detachable from the terminal body. At least some of the above components may operate cooperatively to implement an operation or a method of controlling a glasses-type terminal according to various embodiments explained hereinafter. The operation or the control method of the glasses-type terminal can be implemented on the glasses-type terminal by controlling at least one application program stored in the memory 170.
[0009] Still referring to FIG. 1A, the various components illustrated in this figure will now be described in more detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast-related information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, at least two broadcast receiving modules 111 may be used to facilitate simultaneous reception of at least two broadcast channels, or to support switching among broadcast channels. The mobile communication module 112 may transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (e.g., Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), CDMA2000, EV-DO, Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like).
[0010] Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support the communication of text and multimedia messages. The wireless Internet module 113 is configured to facilitate wireless Internet access. This module can be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 can transmit and/or receive wireless signals over communication networks using wireless Internet technologies. Examples of such wireless Internet technologies include wireless local area network (WLAN), Wi-Fi, Wi-Fi Direct, DLNA, WiBro, WiMAX, HSDPA, HSUPA, LTE, LTE-A, and the like. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or serve as, the mobile communication module 112. The short-distance communication module 114 is configured to facilitate short-distance communications. Suitable technologies for implementing such short-distance communications include BLUETOOTH™, Radio Frequency Identification (RFID), IrDA, Ultra Wide Band (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Wireless Universal Serial Bus (USB), and the like. The short-distance communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless networks. An example of such wireless networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a portable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-distance communication module 114 can detect or recognize the portable device, and allow communication between the portable device and the mobile terminal 100. In addition, when the detected portable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, cause data processed in the mobile terminal 100 to be transmitted to the portable device via the short-distance communication module 114. Thus, a user of the portable device can use the data processed in the mobile terminal 100 on the portable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the portable device. Also, when a message is received in the mobile terminal 100, the user can view the received message using the portable device. The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a GPS module, a Wi-Fi module, or both.
If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. For example, when the mobile terminal uses a GPS module, a position of the mobile terminal can be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information relating to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to allow various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more image acquisition devices 121. Such image acquisition devices 121 can process image frames of still images or video obtained by image sensors in a video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In some cases, the image acquisition devices 121 may be arranged in a matrix configuration to allow a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the image acquisition devices 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image. The microphone 122 is generally implemented to allow audio input into the mobile terminal 100. The audio input may be processed in a variety of ways according to a function performed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removal algorithms to remove unwanted noise generated in the course of receiving external audio. The user input unit 123 is a component that allows input by a user. Such user input may enable the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As an example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key that is located on the mobile terminal at a location other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various forms, for example, a graphic, a text, an icon, a video, or a combination thereof. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal based on the detection provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.
The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141, for example, may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen is of a capacitive type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen through changes of an electromagnetic field, which is responsive to the approach of an object having conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor. The term "proximity touch" will often be used in this application to denote the scenario in which a pointer is positioned close to the touch screen without contacting the touch screen. The term "contact touch" will often be used herein to denote the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to a proximity touch of the pointer relative to the touch screen, such a position corresponds to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving status, and the like). In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data depending on whether a touch with respect to a point on the touch screen is a proximity touch or a contact touch.
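As a toy illustration of the distinction just drawn between a proximity touch and a contact touch on a capacitive screen, the following sketch classifies an event from the measured change in capacitance. The thresholds and function name are invented for the example and are not taken from the description.

```python
def classify_touch(capacitance_delta: float,
                   contact_threshold: float = 0.8,
                   proximity_threshold: float = 0.2) -> str:
    """Classify a capacitive event as a 'contact touch' (pointer physically
    on the screen), a 'proximity touch' (pointer hovering nearby), or none.
    Thresholds are illustrative assumptions, not calibrated values."""
    if capacitance_delta >= contact_threshold:
        return "contact touch"
    if capacitance_delta >= proximity_threshold:
        return "proximity touch"
    return "no touch"

print(classify_touch(0.9))  # contact touch
print(classify_touch(0.4))  # proximity touch
```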
[0011] A touch sensor may detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. For example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or to convert a capacitance occurring at a specific portion of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is typically used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus, a physical pointer, or the like.
[0012] When a touch input is detected by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can detect which area of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof. In some embodiments, the controller 180 may execute the same or different commands according to a type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or a different command according to the object that provides a touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently being executed, for example. The touch sensor and the proximity sensor may be used individually, or in combination, to detect various types of touches. Such touches include a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, can calculate a position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated using this fact. For example, the position of the wave generation source can be calculated using the time difference between the arrival of the ultrasonic wave and the arrival of the light, with the light serving as a reference signal. The image acquisition device 121 typically includes at least one of an image acquisition device sensor (CCD, CMOS, etc.), a photo-sensor (or image sensors), and a laser sensor.
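The time-difference ranging just described reduces to simple arithmetic. A minimal sketch, treating the light arrival as instantaneous and assuming the speed of sound in air at room temperature:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees Celsius

def distance_to_source(t_light_s: float, t_ultrasound_s: float) -> float:
    """Light is effectively instantaneous at these scales, so the arrival
    difference equals the ultrasonic time of flight."""
    return SPEED_OF_SOUND_M_S * (t_ultrasound_s - t_light_s)

# Example: the ultrasonic wave arrives 2.9 ms after the light pulse
print(distance_to_source(0.0, 0.0029))  # ~0.99 m
```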
[0013] Implementing the image acquisition device 121 with a laser sensor can enable detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo-sensor may be laminated on, or overlapped with, the display device. The photo-sensor may be configured to scan a movement of the physical object near the touch screen. In more detail, the photo-sensor may include photodiodes and phototransistors in rows and columns for scanning content received at the photo-sensor using an electrical signal that changes according to the amount of applied light. Namely, the photo-sensor can calculate the coordinates of the physical object according to a variation of light, to thereby obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executed in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images.
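A crude sketch of the coordinate calculation described above, assuming the photo-sensor reports a per-cell change in light level; the grid, values, and function name are invented for the example:

```python
def locate_object(light_delta: list[list[float]]) -> tuple[int, int]:
    """Return the (row, column) of the photodiode cell whose light level
    changed the most, as a crude position estimate of the object."""
    best_r, best_c = 0, 0
    for r, row in enumerate(light_delta):
        for c, value in enumerate(row):
            if value > light_delta[best_r][best_c]:
                best_r, best_c = r, c
    return best_r, best_c

# 3x3 grid of per-cell light change; the object shadows the center cell
print(locate_object([[0.0, 0.1, 0.0],
                     [0.1, 0.9, 0.1],
                     [0.0, 0.1, 0.0]]))  # (1, 1)
```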
[0014] A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme), or the like. The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 may provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like. A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern, and the like of the vibration generated by the haptic module 153 may be controlled by user selection or setting by the controller. For example, the haptic module 153 may output different vibrations in a combined or sequential manner.
[0015] Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation, for example through the user's fingers or arm, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100. An optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call waiting, a missed call, an alarm, a calendar notification, an email reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated when the mobile terminal senses that a user has checked the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive electrical energy to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identification device") may take the form of a smart card.
[0016] Therefore, the identification device can be connected with the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected with an external docking station, the interface unit 160 may serve as a passage to allow electrical energy from the docking station to be supplied to the mobile terminal 100, or may serve as a passage to allow various command signals inputted by the user from the docking station to be transferred to the mobile terminal therethrough. Various command signals or electrical energy inputted from the docking station may serve as signals for recognizing that the mobile terminal is properly mounted on the docking station.
[0017] The memory 170 can store programs to support operations of the controller 180 and store input/output data (e.g., phone book, messages, still images, videos, etc.). The memory 170 can store data related to various patterns of vibration and audio that are output in response to touch inputs on the touch screen.
[0018] The memory 170 may include one or more types of storage media including a flash memory, a hard disk, a solid-state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 can typically control the general operations of the mobile terminal 100. For example, the controller 180 can set or release a lock state for restricting a user from inputting a control command with respect to applications when a state of the mobile terminal meets a preset condition. The controller 180 can also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one component or a combination of those components in order to implement various illustrative embodiments described in this application. The electric power supply unit 190 receives external electrical power or provides internal electrical power and supplies the appropriate electrical power required for operating respective elements and components included in the mobile terminal 100. The electric power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging.
[0019] The electric power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger is electrically connected to supply power for recharging the battery. As another example, the electric power supply unit 190 may be configured to recharge the battery wirelessly without using the connection port. In this example, the electric power supply unit 190 can receive electrical energy, transferred from an external wireless electrical energy transmitter, using at least one of an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or a similar medium using, for example, software, hardware, or any combination thereof. Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include a watch type, a clip type, a glasses type, or a folder type, a flip type, a slide type, a swing type, and a swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The description herein will often relate to a particular type of mobile terminal (e.g., a bar type, a watch type, a glasses type, and the like). However, such teachings with regard to a particular type of mobile terminal will generally apply to other types of mobile terminals as well. The mobile terminal 100 will generally include a housing (e.g., frame, casing, cover, and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated into a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102.
[0020] The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear housing 102. Thus, when the rear cover 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, upon coupling, the rear housing 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing an image acquisition device 121b or an audio output module 152b. The housings 101, 102, 103 may be formed by injection-molding a synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which a plurality of housings form an interior space for accommodating components, the mobile terminal 100 may be configured such that one housing forms the interior space. In this example, a mobile terminal 100 having a uni-body is formed in such a manner that a synthetic resin or a metal extends from a side surface to a rear surface. If desired, the mobile terminal 100 may include a water-sealing unit (not shown) for preventing the introduction of water into the terminal body. For example, the water-sealing unit may include a water-sealing member that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, to hermetically seal an interior space when these housings are coupled.
[0021] The mobile terminal 100 may include the display unit 151, the first audio output module 152a, the second audio output module 152b, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first image acquisition device 121a, the second image acquisition device 121b, the first manipulation unit 123a, the second manipulation unit 123b, the microphone 122, the interface unit 160, etc. The mobile terminal 100 is described below with reference to FIGS. 1B and 1C. The display unit 151, the first audio output module 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first image acquisition device 121a and the first manipulation unit 123a are arranged on the front surface of the terminal body. The second manipulation unit 123b, the microphone 122 and the interface unit 160 are arranged on the side surfaces of the terminal body. The second audio output module 152b and the second image acquisition device 121b are arranged on the rear surface of the terminal body. However, the arrangement of the components is not limited to this configuration. Some components may be omitted or rearranged. For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body. The display unit 151 outputs information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running in the mobile terminal 100, or a user interface (UI) or graphical user interface (GUI) related to such execution screen information. The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display, and combinations thereof. The display unit 151 may be implemented using two display devices, which can implement the same or different display technologies. For instance, a plurality of display units 151 may be arranged on one side, either spaced apart from each other or integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor that detects a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to detect this touch and the controller 180, for example, may generate a control command or other signal corresponding to the touch. The content input in the touching manner may be a text or numerical value, or a menu item which can be indicated or designated in various modes. The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a wire which is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see FIG. 1A). Thus, the touch screen may replace at least some of the functions of the first manipulation unit 123a.
The first audio output module 152a may be implemented in the form of a receiver for transferring call sounds to a user's ear, and the second audio output module 152b may be implemented in the form of a loudspeaker for outputting each type of alarm sound or a multimedia reproduction sound. It may be configured such that the sounds generated from the first audio output module 152a are released along an assembly gap between the structural bodies (for example, between the window 151a and the front housing 101). In this case, a hole independently formed to output audio sounds may not be seen or may otherwise be hidden in terms of appearance, thereby further simplifying the appearance of the mobile terminal 100. The optical output module 154 can output light for indicating an event generation. Examples of events generated in the mobile terminal 100 include message reception, call waiting, a missed call, an alarm, a calendar notification, an email reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. When a user has checked a generated event, for example, the light output is stopped. The first image acquisition device 121a processes image frames of still images or video acquired by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 151, or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide an input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touching, pushing, scrolling, or the like. The first and second manipulation units 123a and 123b may also be implemented in a manner not requiring the user's tactile feeling, for example, through a proximity touch, a hovering touch, or the like. FIG. 1B illustrates the first manipulation unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. An input received at the first and second manipulation units 123a and 123b may be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide an input to a menu, a home key, a cancel, a search, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control a volume level output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit can be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, control of the volume level output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit a touch input, a push input, or combinations thereof.
The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. For example, the rear input unit may be located on an end portion of the rear side of the terminal body so that a user can easily manipulate it using the index finger when the user grasps the terminal body with one hand. Alternatively, the rear input unit can be positioned at virtually any location on the rear side of the terminal body.
[0022] Embodiments that include the rear input unit may implement some or all of the functionality of the first manipulation unit 123a in the rear input unit. As such, in situations where the first manipulation unit 123a is omitted from the front side, the display unit 151 may have a larger screen. As a further alternative, the mobile terminal 100 may include a digital scanning sensor that scans a user's fingerprint. The controller 180 can then use fingerprint information detected by the digital scanning sensor as part of an authentication procedure. The digital scanning sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones may be implemented, with such an arrangement permitting the reception of stereo sounds.
[0023] The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for near field communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, and the like), or an electrical power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a subscriber identification module (SIM), a user identification module (UIM), or a memory card for information storage. The second image acquisition device 121b is shown located on the rear side of the terminal body and has an image capture direction which is substantially opposite to the image capture direction of the first image acquisition device 121a. If desired, the second image acquisition device 121b may alternatively be located at other locations, or made movable, so as to have an image capture direction different from that shown.
[0024] The second image acquisition device 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such an image acquisition device may be referred to as an "array image acquisition device." When the second image acquisition device 121b is implemented as an array image acquisition device, images may be captured in various manners using the plurality of lenses, and images of better quality may be obtained. As shown in FIG. 1C, a flash 124 is shown adjacent to the second image acquisition device 121b. When an image of a subject is captured with the image acquisition device 121b, the flash 124 may illuminate the subject. As shown in FIG. 1B, the second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions together with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna which configures a part of the broadcast receiving module 111 (refer to FIG. 1A) may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a housing that includes a conductive material.
[0025] An electric power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body. The battery 191 may receive electrical energy via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged wirelessly using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear housing 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impacts or from foreign material. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear housing 102. An accessory for protecting the appearance or assisting or extending the functions of the mobile terminal 100 may also be provided on the mobile terminal. As one example of such an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of such an accessory is a touch pen for assisting or extending a touch input to a touch screen. In accordance with still further embodiments, a mobile terminal may also be configured as a portable device wearable on a human body. Such devices go beyond the usual technique of a user gripping the mobile terminal using a hand. Examples of the portable device include a smart watch, smart glasses, a head-mounted display (HMD), and the like. A typical portable device can exchange data with (or cooperate with) another mobile terminal 100. In such a device, the portable device generally has fewer functions than the cooperating mobile terminal. For instance, the short-distance communication module 114 of a mobile terminal 100 may detect or recognize a portable device that is near enough to communicate with the mobile terminal. In addition, when the detected portable device is a device which is authenticated to communicate with the mobile terminal 100, the controller 180 can transmit data processed in the mobile terminal 100 to the portable device via the short-distance communication module 114, for example. Hence, a user of the portable device can use the data processed in the mobile terminal 100 on the portable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the portable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the portable device. FIG. 2 is a perspective view illustrating one example of a glasses-type mobile terminal 400 according to another illustrative embodiment. The glasses-type mobile terminal 400 can be worn on the head of a human body and is provided with an associated mount (case, housing, etc.). The mount may be made of a flexible material to be easily worn. The mount of the mobile terminal 400 is shown having a first mount part 401 and a second mount part 402, which can be made of the same or different materials. In general, the mobile terminal 400 may be configured to include features that are the same as or similar to those of the mobile terminal 100 of FIGS. 1A to 1C. The mount may be supported on the head and defines a space for mounting various components.
As illustrated, electronic components, such as a control module 480, an audio output module 452, and the like, may be mounted on the mount part. Also, a lens 403 for covering either or both of the left and right eyes may be detachably coupled to the mount part.
[0026] The control module 480 controls various electronic components disposed in the mobile terminal 400. It is understood that the control module 480 may be a component corresponding to the aforementioned control unit 180. FIG. 2 shows that the control module 480 is installed in the mount on one side of the head, but other locations are possible.
[0027] The display unit 451 may be implemented in the form of a head-mounted display (HMD). An HMD refers to a display technique by which a display is mounted on the head to present an image directly in front of the user's eyes. In order to provide an image directly in front of the user's eyes when the user is wearing the glasses-type mobile terminal 400, the display unit 451 may be located to correspond to either or both of the left and right eyes. Figure 2 shows the display unit 451 located on a portion corresponding to the right eye, to output an image viewable by the user's right eye.

The display unit 451 can project an image into the user's eye using a prism. Also, the prism can be formed from an optically transparent material so that the user can view both the projected image and the general field of view (the range directly viewed by the user's eyes) in front of the user. In this way, the image output through the display unit 451 can be seen while overlapping the general field of view. The mobile terminal 400 can provide augmented reality (AR) by overlaying a virtual image on a real image or background using such a display characteristic.
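As a rough digital analogue of this overlap, the sketch below alpha-blends a virtual guide layer onto a captured frame with NumPy. It is only a minimal sketch under stated assumptions (8-bit RGB frames, a fixed blend weight); the terminal itself achieves the overlap optically through the prism, and the function and variable names here are illustrative, not part of the described device.

```python
import numpy as np

def overlay_virtual_image(frame: np.ndarray, virtual: np.ndarray,
                          alpha: float = 0.6) -> np.ndarray:
    """Blend a virtual guide layer onto a camera frame (a digital analogue
    of the optical overlap produced through the prism of display unit 451)."""
    # Both arrays are HxWx3 uint8; the virtual layer is weighted by alpha.
    blended = (alpha * virtual.astype(np.float32)
               + (1.0 - alpha) * frame.astype(np.float32))
    return blended.clip(0, 255).astype(np.uint8)

# Example: a dark frame with a white guide rectangle overlaid.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
virtual = np.zeros_like(frame)
virtual[200:280, 280:360] = 255  # hypothetical guide marker
result = overlay_virtual_image(frame, virtual)
```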
[0028] The image acquisition device 421 may be located adjacent to either or both of the left and right eyes to capture an image. Since the image acquisition device 421 is located adjacent to the eye, it can acquire a scene that the user is currently viewing.
[0029] The image acquisition device 421 can be positioned at virtually any location on the mobile terminal. In some embodiments, multiple image acquisition devices 421 may be used. Such multiple image acquisition devices 421 can be used to acquire a stereoscopic image.
[0030] The internal image acquisition device 422 may be arranged adjacent to at least one of the user's left eye or right eye and may face the inside of the lens 403 to capture an image of the user's eyes. In detail, the internal image acquisition device 422 can capture an image of the user's eye blinks, pupil movement, eye size, and the like.

The glasses-type mobile terminal 400 may include user input units 423a and 423b, each of which may be manipulated by the user to provide an input. The user input units 423a and 423b may employ techniques that permit input via a tactile input, such as a touch, a push, or the like. The user input units 423a and 423b are shown located on the mount part and the control module 480, respectively, and operable by pushing or touching. If desired, the mobile terminal 400 may include a microphone that processes input sound into electrical audio data, and an audio output module 452 for outputting an audio signal. The audio output module 452 may be configured to produce an audio signal in a general audio output manner or via bone conduction. When the audio output module 452 is implemented via bone conduction, the audio output module 452 is in contact with the head when the user wears the mobile terminal 400 and vibrates the user's skull to transfer sounds.

Hereinafter, embodiments relating to a control method that can be implemented in the mobile terminal configured as described above will be described with reference to the accompanying drawings. It is obvious to those skilled in the art that modifications, changes, and improvements can be made to what is described below without departing from the nature and general idea of the present disclosure.

Fig. 3 is a flowchart illustrating a method of operating a glasses-type terminal 400 (refer to Fig. 2) according to an illustrative embodiment of the present disclosure. The glasses-type terminal 400 includes mount units 401 and 402, a display unit 451, and a control unit 480. Referring to Fig. 3, first, preset conditions are detected in step S310. In detail, the preset conditions may be detected based on a touch input from the user, a voice input from the user, body information of the user, and the like. Body information may refer to any information obtained from the user's body, such as the user's gaze, pupil size, eye size, blink count, head movement, the length of time the user faces a particular direction, whether the user is looking at a particular subject, and the like.

In one illustrative embodiment, a case in which the user touches the mount 401 or 402, a case in which the user's pupil size increases, a case in which the user faces a fixed direction for a long time, a case in which the user's cheek comes into contact with the lens 403, a case in which a smile is detected from the user's facial muscles, a case in which the user turns his head suddenly, a case in which the user's eyes are not turned forward, a case in which the user looks at a particular subject, a case in which the user tilts his head, and the like, can be set as conditions in advance.

In another illustrative embodiment, a case in which a voice of the user, such as "Oops!", "What is that?", "It looks delicious," "Wow," and the like, is input can be set as a condition in advance.

Subsequently, when the preset conditions are detected, the electrical power of the image acquisition device 421 is turned on in step S320.
When the preset conditions are not detected, the method returns to step S310 to detect the conditions again.
[0031] In particular, the electrical power of the image acquisition device 421 is turned on only when the preset conditions are detected, thus reducing the battery consumption of the glasses-type terminal 400. Also, a heating problem that can occur in the glasses-type terminal 400 can be alleviated.
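To make the power gating of steps S310 and S320 concrete, here is a minimal sketch in Python built from the triggers listed above; the trigger set, the 3-second gaze threshold, and the print stand-in for a driver power-on call are illustrative assumptions, not values or APIs given in the description.

```python
from dataclasses import dataclass

# Voice triggers taken from the examples in the description.
VOICE_TRIGGERS = {"oops", "what is that", "it looks delicious", "wow"}

@dataclass
class BodyInfo:
    pupil_dilated: bool        # pupil size increased
    smiling: bool              # smile detected from facial muscles
    sudden_head_turn: bool     # user suddenly turned his head
    fixed_gaze_seconds: float  # time spent facing a fixed direction

def preset_condition_met(frame_touched: bool, voice: str, body: BodyInfo) -> bool:
    """Step S310: return True when any preset condition is detected."""
    if frame_touched:                               # touch on mount 401/402
        return True
    if voice.strip().lower().rstrip("!?") in VOICE_TRIGGERS:
        return True
    return (body.pupil_dilated or body.smiling or body.sudden_head_turn
            or body.fixed_gaze_seconds > 3.0)       # hypothetical threshold

# Step S320: the camera is powered only when a condition is met,
# which is what saves battery and limits heating.
body = BodyInfo(pupil_dilated=False, smiling=False,
                sudden_head_turn=False, fixed_gaze_seconds=4.2)
if preset_condition_met(frame_touched=False, voice="", body=body):
    print("image acquisition device 421: power ON")
```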
[0032] A movement of the user's pupil is detected to detect a subject in step S322. In detail, a movement of the user's pupil can be detected by the internal image acquisition device 422, which is arranged adjacent to at least one of the left eye and the right eye of the user and faces the user's eyes. Also, a subject can be detected using the detected movement of the user's pupil. For example, in a case in which the user's pupil moves in line with a particular object, the particular object can be detected as the subject whose image the user wishes to capture.

After that, the image captured by the image acquisition device 421 is analyzed to obtain image capture guide information in step S330. In detail, the composition, brightness, degree of vibration, states of the user and the subject, and the like are analyzed from the image captured by the image acquisition device 421 in the turned-on state, and image capture guide information providing an optimized image capture environment is produced.

After that, the produced image capture guide information is output in step S340. In detail, the produced image capture guide information can be provided to the user, in the form of visual information, through the display unit 451. In another illustrative embodiment, the image capture guide information can be provided to the user, in the form of voice information, via the audio output module 452.

Thereafter, when a user control command following the output image capture guide information is input, the image acquisition device 421 is controlled to capture an image according to the output image capture guide information, and the captured image is controlled to be stored in step S350. In an illustrative embodiment, the user may input a control command to capture an image according to the image capture guide information by touching the user input unit 423a or 423b. In another illustrative embodiment, the user may input a control command to capture an image according to the image capture guide information by voice, via a microphone.

Hereinafter, illustrative embodiments of the glasses-type terminal 400 in which image capture guide information is provided will be described with reference to Figs. 4 to 11. Figs. 4 to 11 illustrate scenes viewed by the user through the lens 403 of the glasses-type terminal 400.
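Before turning to the individual figures, the sketch below strings steps S330 to S350 together in Python: simple frame statistics are mapped to guide messages and a capture is triggered on user confirmation. The thresholds and function names are illustrative assumptions; the description does not specify how the analysis is implemented.

```python
def analyze_frame(brightness: float, blur: float, tilt_deg: float) -> list[str]:
    """Step S330: derive guide messages from simple frame statistics.
    All thresholds are illustrative, not values from the description."""
    guides = []
    if brightness < 60:       # dark scene -> propose the flash (Fig. 4)
        guides.append("Use flash?")
    if blur > 0.5:            # shaken frame -> propose anti-vibration mode (Fig. 5)
        guides.append("Use anti-vibration mode?")
    if abs(tilt_deg) > 5.0:   # tilted horizon -> propose horizontal composition (Fig. 6)
        guides.append("Capture with a horizontal composition?")
    return guides

def run_capture_flow(brightness: float, blur: float, tilt_deg: float,
                     user_confirms: bool) -> str:
    for guide in analyze_frame(brightness, blur, tilt_deg):
        print("GUIDE:", guide)          # step S340: output on display unit 451
    if user_confirms:                   # step S350: capture on user command
        return "image captured and stored"
    return "waiting for user command"

print(run_capture_flow(brightness=42.0, blur=0.1, tilt_deg=8.0, user_confirms=True))
```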
[0033] Fig. 4 is a conceptual view illustrating an illustrative embodiment in which the use of a flash is proposed as image capture guide information. Referring to the first drawing of Fig. 4, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 can be turned on and an image 410 can be captured. Here, a plurality of images can be captured to accurately analyze the current image capture state. For example, when a voice, such as "Ah," "Oh," and the like, is input by the user, the electrical power of the image acquisition device 421 can be automatically turned on and the image 410 captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect a subject, and capture the image 410 so that the detected subject is included.

Referring to the second drawing of Fig. 4, when it is determined that the image 410 captured in the first drawing of Fig. 4 is dark, image capture guide information 420 asking whether to use a flash can be output. Here, a voice asking whether to use the flash can be output together.

Referring to the third drawing of Fig. 4, when the user inputs a control command to use the flash, an image 430 is captured using the flash. For example, when the user nods his head, the image acquisition device 421 can capture the image 430 again using the flash.
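The darkness test behind guide information 420 is not specified in the description; one plausible stand-in, sketched below with NumPy, compares the mean luma of the captured frame against a threshold. The Rec. 601 luma weights are standard, but the threshold value is an assumption.

```python
import numpy as np

def is_too_dark(frame_rgb: np.ndarray, threshold: float = 60.0) -> bool:
    """Return True when mean luminance suggests offering the flash."""
    luma = (0.299 * frame_rgb[..., 0] + 0.587 * frame_rgb[..., 1]
            + 0.114 * frame_rgb[..., 2])
    return float(luma.mean()) < threshold

dim_frame = np.full((480, 640, 3), 30, dtype=np.uint8)  # uniformly dim test image
if is_too_dark(dim_frame):
    print("GUIDE: Use flash?")  # corresponds to guide information 420
```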
[0034] Fig. 5 is a conceptual view illustrating an illustrative embodiment in which the use of an anti-vibration mode is proposed as image capture guide information. Referring to the first drawing of Fig. 5, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 can be turned on and an image 510 can be captured. Here, a plurality of images can be captured to accurately analyze the current image capture state. For example, when a touch input by the user tapping on the frame of the smart glasses 400 is detected, the electrical power of the image acquisition device 421 is automatically turned on and the image 510 can be captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect a subject and capture the image 510 to include the detected subject.

Referring to the second drawing of Fig. 5, when it is determined that the image 510 captured in the first drawing of Fig. 5 has been shaken, image capture guide information 520 asking whether to use an anti-vibration mode can be output. Here, a voice asking whether to use the anti-vibration mode can be output together. In another illustrative embodiment, image capture guide information asking whether to capture an image again without motion or vibration can be output.
[0035] Referring to the third drawing of Fig. 5, when the user inputs a control command to use the anti-vibration mode, the image 530 is captured using the anti-vibration mode. For example, when the user responds "Yes," the image 530 can be captured again with the anti-vibration mode enabled.
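The description likewise leaves open how a shaken image is recognized; a common heuristic, sketched below with OpenCV, scores sharpness by the variance of the Laplacian, which drops for blurred or shaken frames. Both the heuristic and the threshold are assumptions standing in for whatever detector the terminal actually uses.

```python
import cv2
import numpy as np

def looks_shaken(frame_gray: np.ndarray, threshold: float = 100.0) -> bool:
    """Low Laplacian variance indicates a blurred or shaken frame.
    The threshold is illustrative and would need per-device tuning."""
    return cv2.Laplacian(frame_gray, cv2.CV_64F).var() < threshold

noise = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
blurry = cv2.GaussianBlur(noise, (21, 21), 0)  # simulate a shaken frame
if looks_shaken(blurry):
    print("GUIDE: Use anti-vibration mode?")  # guide information 520
```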
[0036] Fig. 6 is a conceptual view illustrating an illustrative embodiment in which the use of a horizontal composition is proposed as image capture guide information. Referring to the first drawing of Fig. 6, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 can be turned on and an image 610 can be captured. For example, in a case in which the user stares in a fixed direction, the electrical power of the image acquisition device 421 is automatically turned on and the image 610 can be captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect a subject and capture the image 610 to include the detected subject.

Referring to the second drawing of Fig. 6, when it is determined that the composition of the image 610 captured in the first drawing of Fig. 6 is tilted, image capture guide information 620 proposing capturing an image in a horizontal composition can be output. Here, a voice proposing capturing an image in the horizontal composition can be output together.
[0037] In another illustrative embodiment, image capture guide information proposing a different image capture composition, such as a triangular composition, or the like, may be output.

Referring to the third drawing of Fig. 6, when the user inputs a control command to capture an image with the horizontal composition, image capture guide information 630 can be output again. For example, in a case in which the user nods or responds "Yes," the image capture guide information 630 requesting that the user tilt his head to the right to achieve the horizontal composition can be output. Also, for this purpose, a horizontal line 640 can be output together as a reference.

Referring to the fourth drawing of Fig. 6, when the user tilts his head to the right so that the ground is aligned with the reference horizontal line 640, a message 650 indicating that the horizontal adjustment has been completed can be output. Also, when the ground is aligned with the reference horizontal line 640, the image acquisition device 421 automatically captures an image 660.

Figures 7 and 8 are conceptual views illustrating illustrative embodiments in which the use of a central composition is proposed as image capture guide information. Referring to the first drawing of Fig. 7, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 is turned on and an image 710 can be captured. For example, when the user's eye size increases, the electrical power of the image acquisition device 421 is automatically turned on and the image 710 is captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect a subject 712, and capture the image 710 to include the detected subject 712.

Referring to the second drawing of Fig. 7, when it is determined that the subject 712 is not positioned at the center of the captured image 710, image capture guide information 720 proposing image capture with the subject 712 placed at the center can be output. Here, a voice proposing image capture with the subject 712 placed at the center can be output together.

Referring to the third drawing of Fig. 7, when the user taps on the frame of the smart glasses 400, image capture guide information 730 requesting leftward movement can be output. Also, for this purpose, a reference position 740 can be output together.

Referring to the fourth drawing of Fig. 7, when the user moves to the left and the subject 712 is placed at the reference position 740, a message 750 indicating that the subject 712 is centered can be output. Also, when the subject 712 is placed at the reference position 740, the image acquisition device 421 automatically captures an image 760.

In another illustrative embodiment, referring to the first drawing of Fig. 8, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 is turned on and an image 810 can be captured. For example, in a case in which a change in the user's facial muscles is detected and the user smiles, the electrical power of the image acquisition device 421 can be turned on and the image 810 can be captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect a subject 812 and capture the image 810 to include the detected subject 812.
Referring to the second drawing of Figure 8, when it is determined that the subject 812 in the first drawing of Fig. 8 is not positioned at the center of the captured image 810, image capture guide information 820 proposing rotation of the image acquisition device 421 can be output. Here, a voice proposing rotation of the image acquisition device 421 can be output together. Referring to the third drawing of Figure 8, when the user inputs a control command to rotate the image acquisition device 421, an image 830 to be captured by rotating the image acquisition device 421 can be provided in advance. For example, when the user blinks slowly, the image 830 to be captured by rotating the image acquisition device 421 can be provided in advance.
[0038] In detail, the image 830 to be captured by rotating the image acquisition device 421 can be output in overlap with the image 850 seen by the user through the lens 403 of the glasses-type terminal 400. Also, a message 840 indicating that the subject 812 is centered according to the rotation of the image acquisition device 421 can be output together.
[0039] Referring to the fourth drawing of Fig. 8, when the subject 812 is centered according to the rotation of the image acquisition device 421, the image acquisition device 421 automatically captures the corresponding image 830. Also, a message 860 indicating that the image capture is complete can be output.
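The tilt and centering tests driving the guidance of Figs. 6 to 8 can be reduced to simple geometry, as in the sketch below: a tilt angle is compared against a tolerance, and the subject's horizontal offset from the frame center is turned into a pan direction. The tolerances, the sign conventions, and the assumption that a detector has already supplied the subject's position are all illustrative.

```python
def tilt_guide(horizon_deg: float, tolerance_deg: float = 2.0) -> str:
    """Fig. 6: compare the detected horizon angle with level (0 degrees).
    Positive angles are taken to mean the horizon rises to the right."""
    if abs(horizon_deg) <= tolerance_deg:
        return "Horizontal adjustment complete"          # cf. message 650
    side = "right" if horizon_deg > 0 else "left"
    return f"Tilt your head to the {side} to level the shot"   # cf. info 630

def centering_guide(subject_cx: float, frame_width: int,
                    tolerance: float = 0.05) -> str:
    """Figs. 7-8: compare the subject's horizontal center with the frame
    center; the tolerance is a fraction of the frame width."""
    offset = (subject_cx - frame_width / 2) / frame_width
    if abs(offset) <= tolerance:
        return "Subject centered -- capturing"           # cf. messages 750/840
    side = "right" if offset > 0 else "left"
    return f"Pan the camera to the {side} to center the subject"  # cf. info 730/820

print(tilt_guide(horizon_deg=6.0))
print(centering_guide(subject_cx=540, frame_width=640))
```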
[0040] When motion of the subject is detected from the image captured by the image acquisition device 421, an instruction for the user's eyes to follow the movement of the subject can be output as image capture guide information. Also, when it is detected from the image captured by the image acquisition device 421 that the subject is moving, continuous image capture following the movement of the subject by the image acquisition device 421, or video capture, can be proposed as image capture guide information.

Figs. 9 and 10 are conceptual views illustrating illustrative embodiments in which a response of the user according to a movement of a subject is proposed as image capture guide information. Referring to the first drawing of Fig. 9, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 can be turned on and an image 910 can be captured. For example, when the user, looking in one direction, suddenly turns his head in a different direction, the electrical power of the image acquisition device 421 can be turned on and the image 910 can be captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect a subject 912 and capture the image 910 to include the detected subject 912.

Referring to the second drawing of Figure 9, when it is determined that the subject 912 is moving in the first drawing of Fig. 9, image capture guide information 920 proposing that the user move his eyes can be output. Here, a voice proposing the movement of the user's eyes can be output together. Referring to the third drawing of Fig. 9, when the user's eyes move to the right and the subject 912 is positioned at the center 930 of the viewfinder, the image acquisition device 421 automatically captures an image 940.

In another illustrative embodiment, referring to the first drawing of FIG. 10, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 can be turned on and an image 1010 can be captured. For example, in a case in which the user stares at a particular target 1012, such as a person, the electrical power of the image acquisition device 421 is automatically turned on and the image 1010 can be captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect the subject 1012 and capture the image 1010 to include the detected subject 1012. Referring to the second drawing of Fig. 10, when it is determined that the subject 1012 is moving in the first drawing of Fig. 10, image capture guide information 1020 proposing the adjustment of the distance to the subject 1012 can be output.
[0041] For example, in a case in which the person 1012 walks toward the user, the image capture guide information 1020 requesting the user to keep a greater distance from the person 1012 can be output. Here, a voice proposing keeping a greater distance from the person 1012 can be output together. Alternatively, image capture guide information requesting the user to turn his head to change direction may be output. Referring to the third drawing of Fig. 10, when the user has moved and the entire figure of the person 1012 fits within the viewfinder, the image acquisition device 421 automatically captures an image 1030.

Fig. 11 is a conceptual view illustrating an illustrative embodiment in which a response of the image acquisition device according to a movement of a subject is proposed as image capture guide information.
[0042] Referring to the first drawing of Fig. 11, when it is detected that a particular motion or behavior of the user satisfies the preset conditions, the electrical power of the image acquisition device 421 is turned on and an image 1110 can be captured. For example, when the user stares at a moving subject 1112, the electrical power of the image acquisition device 421 can be automatically turned on and the image 1110 can be captured. Here, the image acquisition device 421 can detect a movement of the user's pupil to detect the subject 1112 and capture the image 1110 to include it.

Referring to the second drawing of Fig. 11, when it is determined that the subject 1112 is moving in the first drawing of Fig. 11, the image acquisition device 421 can be rotated in the direction in which the subject 1112 moves, to capture images continuously. In detail, the image 1130 captured continuously by rotating the image acquisition device 421 can be output in overlap with an image 1120 seen by the user through the lens 403 of the glasses-type terminal 400. To this end, the image acquisition device 421 of the glasses-type terminal 400 can be implemented to rotate 360 degrees in accordance with a movement of a subject.
[0043] Referring to the third drawing of FIG. 11, when the entire subject 1112 fits within the viewfinder according to the rotation of the image acquisition device 421, or is positioned at the center of the viewfinder, the automatic image capture by the image acquisition device 421 is stopped.
[0044] In another illustrative embodiment, the image acquisition device 421 can be rotated continuously to automatically capture an image only when the entire subject 1112 is captured in the viewfinder, or to automatically and continuously capture images until the subject 1112 disappears from the viewfinder.
[0045] Referring to the fourth drawing of Fig. 11, even after the automatic image capture, a message asking whether the image capture should be continued, or a message 1150 asking whether a video should be captured, can be output. Accordingly, when a user control command following the output image capture guide information 1150 is input, the image acquisition device 421 can be rotated to continue capturing images or video. Here, even when the user's eyes are not turned toward the subject, the image acquisition device 421 can be rotated to capture an image of the subject.
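A minimal sketch of the moving-subject behavior of Figs. 9 to 11 is given below: a crude frame-difference test stands in for subject tracking, and capture continues while motion persists and stops once the scene settles (a stand-in for the subject leaving the viewfinder). The difference threshold and the stopping rule are assumptions; a real implementation would localize and track the subject.

```python
import numpy as np

def subject_moved(prev_gray: np.ndarray, curr_gray: np.ndarray,
                  threshold: float = 12.0) -> bool:
    """Crude motion test: mean absolute difference between frames."""
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    return float(diff.mean()) > threshold

def continuous_capture(frames: list[np.ndarray]) -> int:
    """Capture while motion persists; stop once the scene settles."""
    shots = 0
    for prev, curr in zip(frames, frames[1:]):
        if subject_moved(prev, curr):
            shots += 1   # capture another image of the sequence
        else:
            break        # motion ended: stop automatic capture
    return shots

rng = np.random.default_rng(0)
moving = [rng.integers(0, 255, (120, 160), dtype=np.uint8) for _ in range(3)]
still = [moving[-1].copy(), moving[-1].copy()]   # scene settles
print(continuous_capture(moving + still), "images captured")
```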
[0046] Subsequently, the user can input a command to store the captured image. For example, in a case in which the user blinks or touches the frame, the captured image may be stored. In another illustrative embodiment, when a user command to store the captured image is not input, the image capture situation may be re-analyzed and new corresponding image capture guide information can be proposed. In another illustrative embodiment, the images captured continuously according to the rotation of the image acquisition device 421 can be automatically stored even without a user command to store them.
[0047] According to at least one of the illustrative embodiments of the present description, the battery consumption of the glasses-type terminal can be reduced and a heating problem can be solved.
[0048] Also, according to at least one of the illustrative embodiments of the present description, the convenience of use of the glasses-type terminal can be improved by providing an optimized image capture condition.

The present invention described above can be implemented as computer-readable code in a medium in which a program is recorded. A computer-readable medium includes any type of recording device in which data that can be read by a computer system is stored. A computer-readable medium may be, for example, a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. A computer-readable medium also includes implementations in the form of carrier waves (for example, transmission over the Internet). Also, the computer may include the control unit 180 of the terminal.

Thus, the foregoing detailed description should not be interpreted restrictively and should be considered illustrative. The scope of the present invention is to be determined by a reasonable interpretation of the appended claims, and every modification within the equivalent scope is included within the scope of the present invention.

The foregoing embodiments and advantages are merely illustrative and should not be construed as limiting the present disclosure. The present teachings can be readily applied to other types of apparatus. This description is intended to be illustrative, and not to limit the scope of the claims. Those skilled in the art will appreciate that many alternatives, modifications, and variations are possible. The features, structures, methods, and other characteristics of the illustrative embodiments described herein may be combined in various ways to obtain additional and/or alternative illustrative embodiments.

Since the present features may be embodied in several forms without departing from their characteristics, it should also be understood that the embodiments described above are not limited by any of the details of the foregoing description, unless otherwise specified; rather, they should be construed broadly within the scope defined in the appended claims, and therefore all changes and modifications that fall within the bounds of the claims, or equivalents of such bounds, are intended to be embraced by the appended claims.
权利要求:
Claims (20)
[0001]
1. A glasses-type terminal, comprising: a mount (401, 402) configured to be worn on a user's head; an image acquisition device (421) configured to capture images; a display unit (451); and a control unit (480) configured to: activate the image acquisition device (421) when at least one of a plurality of preset conditions is satisfied, and cause the display unit (451) to display image capture guide information according to an image obtained by the image acquisition device (421).
[0002]
The glasses-type terminal of claim 1, wherein the control unit (480) is further configured to determine that the at least one of the plurality of preset conditions is satisfied based on at least one of a touch input, a voice input, or biometric information of the user.
[0003]
The glasses-type terminal of claim 1, wherein the control unit (480) is further configured to cause the image acquisition device (421) to capture the image in response to a user control command according to the image capture guide information.
[0004]
The glasses-type terminal of claim 1, wherein the image capture guide information includes an instruction for the user to visually follow a subject when motion of the subject is detected in the captured image.
[0005]
The glasses-type terminal of claim 1, wherein: the control unit (480) is further configured to capture a plurality of consecutive images in a burst mode or a video mode when motion of a subject is detected in the image; and the image capture guide information includes information to continue capturing the plurality of consecutive images.
[0006]
The glasses-type terminal of claim 5, wherein the information to continue capturing the plurality of consecutive images comprises at least one of: an instruction for the user to visually follow the subject, information relating to rotation of the image acquisition device (421), or information relating to the burst mode or the video mode.
[0007]
The glasses-type terminal of claim 5, wherein the control unit (480) is further configured to cause the image acquisition device (421) to stop capturing the plurality of consecutive images when the subject is no longer included in the obtained image.
[0008]
The glasses-type terminal of claim 5, wherein the control unit (480) is further configured to cause the image acquisition device (421) to rotate to continue capturing the plurality of consecutive images of the subject in response to a user control command according to the image capture guide information.
[0009]
The glasses-type terminal of claim 1, wherein: the image capture guide information comprises an instruction for the user to move the image acquisition device (421) to include an object at the center of the obtained image; and the control unit (480) is further configured to cause the image acquisition device (421) to capture the image in response to a user control command according to the image capture guide information.
[0010]
The glasses-type terminal of claim 1, wherein: the image capture guide information comprises an instruction for the user to tilt the image acquisition device (421) to level the obtained image horizontally; and the control unit (480) is further configured to cause the image acquisition device (421) to capture the image in response to a user control command according to the image capture guide information.
[0011]
A method for controlling a glasses-type terminal worn on a user's head, the method comprising: activating an image acquisition device (421) of the terminal when at least one of a plurality of preset conditions is satisfied; and displaying image capture guide information according to an image obtained by the image acquisition device (421).
[0012]
The method of claim 11, further comprising determining that the at least one of the plurality of preset conditions is satisfied based on at least one of a touch input, a voice input, or biometric information of the user.
[0013]
The method of claim 11, further comprising capturing the image in response to a user control command according to the image capture guide information.
[0014]
The method of claim 11, wherein the image capture guide information includes an instruction for the user to visually follow a subject when movement of the subject is detected in the captured image.
[0015]
The method of claim 11, further comprising capturing a plurality of consecutive images in a burst mode or a video mode, when motion of a subject is detected in the image; wherein the image capture guide information includes information to continue capturing the plurality of consecutive images.
[0016]
The method of claim 15, wherein the information to continue capturing the plurality of consecutive images includes at least one of: an instruction for the user to visually follow the subject, information relating to rotation of the image acquisition device (421), or information relating to the burst mode or the video mode.
[0017]
The method of claim 15, further comprising discontinuing capturing the plurality of consecutive images when the subject is no longer included in the obtained image.
[0018]
The method of claim 15, further comprising rotating the image acquisition device (421) to continue capturing the plurality of consecutive images of the subject in response to a user control command according to the image capture guide information.
[0019]
The method of claim 11, wherein the image capture guide information comprises an instruction for the user to move the image acquisition device (421) to include an object at the center of the obtained image; and the method further comprises capturing the image in response to a user control command according to the image capture guide information.
[0020]
The method of claim 11, wherein the image capture guide information comprises an instruction for the user to tilt the image acquisition device (421) to level the obtained image horizontally; and the method further comprises capturing the image in response to a user control command according to the image capture guide information.