MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
Patent Abstract:
A mobile terminal including an image capture apparatus; a touch screen configured to continuously display images captured by the image capture apparatus; a memory; and a controller configured to: in response to a first touch applied to the touch screen while the images are continuously displayed, store in the memory a first image displayed on the touch screen at the time the first touch is applied, and display a first thumbnail image corresponding to the first image on the touch screen; and, in response to a second touch applied to the touch screen while the images are continuously displayed, store in the memory a second image displayed on the touch screen at the time the second touch is applied, and group the second image with the first image in the memory.

Publication number: FR3043478A1
Application number: FR1654247
Filing date: 2016-05-12
Publication date: 2017-05-12
Inventors: Wanho Ju; Youngjun Lee; Seonhwi Cho; Donghoe Kim
Applicant: LG Electronics Inc.
Primary IPC class:
Patent Description:
MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME

The present invention relates to a mobile terminal capable of capturing video and storing images.

A mobile terminal can record and reproduce videos, and display graphical user interfaces (GUIs) to implement the different functions of the terminal. Mobile terminals include laptops, cell phones, glasses and watches capable of displaying screen information, gaming machines, and the like. As it becomes multifunctional, a mobile terminal can capture still or moving images, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated media player. Many efforts include not only changes and improvements to the structural components implementing a mobile terminal, but also software enhancements to support and enhance the terminal's functions.

Recently, as the image capture apparatuses installed in mobile terminals have improved in performance and users' needs have grown along with the camera's capabilities, various video and image capture functions are under development. However, when a large number of images are captured, and specifically when images included in a video are stored separately, an additional task of grouping these images is impractically required.

Therefore, an object of the present invention is to address the above-noted and other problems of the related art. Another object of the present invention is to provide a method of storing images in a classified manner during the capture of a video.

To achieve these and other advantages and in accordance with the purpose of the present invention, as shown and broadly described herein, the present invention provides, in one aspect, a mobile terminal including an image capture apparatus; a touch screen configured to continuously display images captured by the image capture apparatus; a memory; and a controller configured to: in response to a first touch applied to the touch screen while the images are continuously displayed, store in the memory a first image displayed on the touch screen at the time the first touch is applied, and display a first thumbnail image corresponding to the first image on the touch screen; and, in response to a second touch applied to the touch screen while the images are continuously displayed, store in the memory a second image displayed on the touch screen at the time the second touch is applied, and group the second image with the first image in the memory.

A further scope of applicability of the present application will become more apparent from the detailed description provided hereinafter. However, it is to be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are provided by way of illustration only, as various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from the detailed description.

The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments and, together with the description, serve to explain the principles of the invention. In the drawings:

Fig. 1A is a block diagram of a mobile terminal according to an embodiment of the present invention;

Figs. 1B and 1C are conceptual views illustrating an example of a mobile terminal, viewed from different directions;
Fig. 2A is a flowchart illustrating a method for controlling a mobile terminal in accordance with an embodiment of the present invention;

Figs. 2B and 2C are conceptual views illustrating the control method of Fig. 2A;

Figs. 3A and 3B are conceptual views illustrating a method of controlling image storage during video capture in accordance with an embodiment of the present invention;

Figs. 4A-4D are conceptual views illustrating a method of controlling image storage in a grouped manner when distinguishable subjects are detected;

Fig. 5A is a flowchart illustrating a control method of controlling a function associated with a stored image;

Figs. 5B to 5E are conceptual views each illustrating a method of controlling the editing of a video file and stored images in accordance with various embodiments described herein;

Figs. 6A and 6B are conceptual views illustrating a method of controlling the individual selection of separately stored images;

Figs. 7A to 7C are conceptual views illustrating a method of controlling the editing of an image classification according to different embodiments;

Figs. 8A-8D are conceptual views illustrating a method of controlling the modification of pre-stored image categories; and

Figs. 9A-9C are conceptual views illustrating a method of controlling the output of a video file and stored images.

A description is now provided in detail according to embodiments described herein, with reference to the accompanying drawings. For brevity of the description with reference to the drawings, the same or equivalent components may be provided with the same or like reference numerals, and their description will not be repeated. In general, a suffix such as "module" or "unit" may be used to refer to elements or components. The use of such a suffix herein is merely intended to facilitate the description of the specification, and the suffix itself is not intended to convey any special meaning or function. In the present disclosure, what is well known to those of ordinary skill in the art has generally been omitted for brevity. The accompanying drawings are used to aid in easily understanding various technical features, and it is to be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present description should be construed as extending to any modifications, equivalents and substitutes in addition to those which are particularly shown in the accompanying drawings.

Mobile terminals presented herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like. By way of non-limiting example only, further description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as the types noted above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers, and the like.

With reference to Figs. 1A-1C, Fig. 1A is a block diagram of a mobile terminal in accordance with an embodiment of the present invention, and Figs. 1B and 1C are conceptual views illustrating an example of a mobile terminal, viewed from different directions.
The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and an electric power supply unit 190. Implementing all of the illustrated components is not a requirement, and more or fewer components may alternatively be implemented.

In more detail, the wireless communication unit 110 may typically include one or more modules that allow communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, or communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may typically include one or more modules that connect the mobile terminal 100 to one or more networks. The wireless communication unit 110 may include one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include an image capture apparatus 121 or an image input unit for obtaining images or video; a microphone 122, which is a type of audio input device for inputting an audio signal; and a user input unit 123 (for example, a touch key, a mechanical key, and the like) to enable a user to enter information. Data (e.g., audio, video, image, and the like) can be obtained by the input unit 120 and can be analyzed and processed according to user commands.

The detection unit 140 may typically be implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, the detection unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., the image capture apparatus 121), the microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a health sensor, a biometric sensor, and the like). The mobile terminal described herein may be configured to use information obtained from one or more sensors of the detection unit 140, and combinations thereof.

The output unit 150 may typically be configured to output various types of information, such as audio, video, touch output, and the like. The output unit 150 is shown having at least one of a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an inter-layer structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as serve as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user.

The interface unit 160 serves as an interface to various types of external devices that may be coupled to the mobile terminal 100.
The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform matching control functions associated with a connected external device, in response to the external device being connected to the interface unit 160.

The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed inside the mobile terminal 100 at the time of manufacture or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.

The controller 180 typically operates to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various aforementioned components, or by activating application programs stored in the memory 170. Also, the controller 180 controls some or all of the components shown in Fig. 1A according to the execution of an application program that has been stored in the memory 170. In addition, the controller 180 can operate at least two of the components included in the mobile terminal in combination to activate the application program.

The electric power supply unit 190 may be configured to receive external electrical power or to provide internal electrical power in order to supply the appropriate electrical power required to operate elements and components included in the mobile terminal 100. The electric power supply unit 190 may include a battery, and the battery may be configured to be incorporated into the terminal body, or configured to be separable from the terminal body.

At least a portion of the components may operate cooperatively to implement an operation, a control or a control method of a mobile terminal in accordance with various embodiments described herein. Also, the operation, control or control method of the mobile terminal may be implemented on the mobile terminal by activation of at least one application program stored in the memory 170.

Hereinafter, a description will be given in more detail of the aforementioned components with reference to Fig. 1A, before describing various embodiments implemented via the mobile terminal 100. First, with respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast-related information from an external broadcast management entity through a broadcast channel.
The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels.

The mobile communication module 112 may transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities are part of a mobile communication network, which is built according to technical standards or communication methods for mobile communications (e.g., Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), CDMA 2000, Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like). Examples of wireless signals include audio call signals, video call (telephony) signals, or various data formats for supporting the communication of text and multimedia messages.

The wireless Internet module 113 is configured to facilitate wireless Internet access. This module can be coupled internally or externally to the mobile terminal 100. The wireless Internet module 113 can transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet access include wireless LAN (WLAN), Wi-Fi, Wi-Fi Direct, DLNA, WiBro, WiMAX, HSDPA, HSUPA, LTE, LTE-A, and the like. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, as well as other Internet technologies. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access and can be understood as a type of the mobile communication module 112.

The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Universal Serial Bus, and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless networks. An example of such wireless networks is a wireless personal area network.

Here, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100).
The short-range communication module 114 can detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180, for example, can cause at least a portion of data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Thus, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device.

The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position (or current position) of the mobile terminal. For example, the location information module 115 includes a global positioning system (GPS) module, a Wi-Fi module, or both. For example, when the mobile terminal uses a GPS module, a position of the mobile terminal can be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module. If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal. The location information module 115 is a module used to acquire the position (or the current position) and is not limited to a module for directly calculating or acquiring the position of the mobile terminal.

The input unit 120 may be configured to allow various types of inputs to the mobile terminal 100. Examples of such inputs include audio, image, video, data, and user input. Image and video input is often obtained using one or more image capture apparatuses 121. Such image capture apparatuses 121 can process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In addition, the image capture apparatuses 121 may be arranged in a matrix configuration to allow a plurality of images having various angles or focal points to be input to the mobile terminal 100. Also, the image capture apparatuses 121 may be located in a stereoscopic arrangement for acquiring left and right images for implementing a stereoscopic image.

The microphone 122 processes an external audio signal into electrical audio (sound) data. The processed audio data may be processed in a variety of ways according to a function performed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving the external audio signal.

The user input unit 123 is a component that allows input by a user. Such user input may enable the controller 180 to control the operation of the mobile terminal 100.
The user input unit 123 may include one or more of a mechanical input element (for example, a mechanical key, a button located on a front and/or rear surface or side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch input element, among others. For example, the touch input element can be a virtual key, a soft key or a visual key, which is displayed on a touch screen by software processing, or a touch key which is located on the mobile terminal at a location other than the touch screen. In addition, the virtual key or the visual key can be displayed on the touch screen in various forms, for example, graphics, text, icons, video, or a combination thereof.

The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like, and generate a corresponding detection signal. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal, based on the detection signal. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.

The proximity sensor 141 refers to a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like without mechanical contact. The proximity sensor 141 may be arranged in an interior region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141, for example, may include any of a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and the like. When the touch screen is implemented in capacitive form, the proximity sensor 141 can detect the proximity of a pointer relative to the touch screen through changes in an electromagnetic field, which responds to the approach of an object with conductivity. In this case, the touch screen (touch sensor) can also be categorized as a proximity sensor.

The term "proximity touch" will often be used herein to denote the scenario in which a pointer is positioned to be close to the touch screen without touching the touch screen. The term "contact touch" will often be used herein to denote the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to a proximity touch of the pointer relative to the touch screen, such a position corresponds to the position where the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving state, and the like). In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be output on the touch screen.
In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data (or information) depending on whether a touch relative to a point on the touch screen is a proximity touch or a contact touch.

A touch sensor can detect a touch (or touch input) applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. For example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or to convert a capacitance occurring at a specific portion of the display unit 151, into electrical input signals. The touch sensor can also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is typically used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus, a pointer, or the like.

When a touch input is detected by a touch sensor, corresponding signals can be transmitted to a touch controller. The touch controller can process the received signals, and then transmit corresponding data to the controller 180. Therefore, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof.

In addition, the controller 180 may execute the same or different controls depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or a different control according to the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program, for example.

The touch sensor and the proximity sensor may be implemented individually, or in combination, to detect various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.

If desired, an ultrasonic sensor may be implemented to recognize location information relating to a touch object using ultrasonic waves. The controller 180, for example, can calculate the position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated using this fact. For example, the position of the wave generation source can be calculated using the time difference from the time that the ultrasonic wave reaches the sensor, based on the light as a reference signal; a sketch of this computation is given after the following paragraph.

The image capture apparatus 121, which has been illustrated as a component of the input unit 120, typically includes at least one of an image capture apparatus sensor (CCD, CMOS, etc.), a photo sensor (or image sensors), and a laser sensor. Implementing the image capture apparatus 121 with a laser sensor can allow the detection of a touch of a physical object with respect to a 3D stereoscopic image.
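As an illustration of the time-difference computation described above, the following is a minimal sketch, not taken from the patent; the function name, the nanosecond timestamps, and the assumed speed of sound are all illustrative assumptions:

```kotlin
// Illustrative sketch only: estimating the distance from an ultrasonic wave
// generation source to one sensor, treating the arrival of the light at the
// optical sensor as a near-instantaneous reference signal.
const val SPEED_OF_SOUND_M_PER_S = 343.0 // assumed value for air at about 20 °C

fun distanceToSource(lightArrivalNs: Long, ultrasoundArrivalNs: Long): Double {
    // The light's travel time is negligible, so the difference between the two
    // arrival times approximates the ultrasonic wave's travel time.
    val travelTimeSeconds = (ultrasoundArrivalNs - lightArrivalNs) / 1e9
    return SPEED_OF_SOUND_M_PER_S * travelTimeSeconds // distance in metres
}
```

With three or more ultrasonic sensors at known positions, the per-sensor distances obtained this way could be combined by trilateration to locate the wave generation source.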
The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan a movement of the physical object near the touch screen. In more detail, the photo sensor may include photodiodes and transistors in rows and columns for scanning content received at the photo sensor using an electrical signal that changes according to the amount of light applied. Namely, the photo sensor can calculate the coordinates of the physical object according to variations of light, to thereby obtain location information of the physical object.

The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executed in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. Also, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme), or the like.

The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 may provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like.

A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern and the like of the vibration generated by the haptic module 153 can be controlled by user selection or setting by the controller. For example, the haptic module 153 can output different vibrations in a combined or sequential manner.

In addition to vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect of reproducing the sense of cold or warmth using an element that can absorb or generate heat, and the like.

The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation, for example through the user's fingers or arm, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.

An optical output module 154 may output a signal to indicate event generation using light from a light source.
Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notification, an e-mail reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented so that the mobile terminal emits monochromatic light or light of a plurality of colors. The signal output may be terminated when the mobile terminal detects that a user has checked the generated event, for example.

The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 may receive data transmitted from an external device, receive electrical power to be transferred to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.

The identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identification device") may take the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 via the interface unit 160.

When the mobile terminal 100 is connected to an external docking station, the interface unit 160 can serve as a passage to allow electrical power from the docking station to be supplied to the mobile terminal 100, or can serve as a passage to allow various command signals input by the user from the docking station to be transferred to the mobile terminal therethrough. Various command signals or the electrical power input from the docking station may operate as signals for recognizing that the mobile terminal is properly mounted on the docking station.

The memory 170 can store programs to support operations of the controller 180 and store input/output data (e.g., a phonebook, messages, still images, videos, etc.). The memory 170 can store data related to various patterns of vibration and audio that are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media, including a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet.

The controller 180 may typically control operations relating to application programs and the general operations of the mobile terminal 100.
For example, the controller 180 may set or release a lock state for restricting a user from entering a control command with respect to applications when a state of the mobile terminal meets a preset condition. The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of these components in order to implement various embodiments described herein.

The electric power supply unit 190 receives external electrical power or provides internal electrical power and supplies the appropriate electrical power required for operating respective elements and components included in the mobile terminal 100. The electric power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The electric power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger for supplying electrical power to recharge the battery is electrically connected.

As another example, the electric power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port. In this example, the electric power supply unit 190 can receive electrical power, transferred from an external wireless electrical power transmitter, using either an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance.

Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or a similar medium using, for example, software, hardware, or any combination thereof.

Referring now to Figs. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include a watch type, a clip type, a glasses type, or a folder type, a flip type, a slide type, a swing type, and a swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The description herein will often relate to a particular type of mobile terminal. However, such teachings with regard to a particular type of mobile terminal will generally apply to other types of mobile terminals as well.

The mobile terminal 100 will generally include a housing (e.g., frame, casing, cover, and the like) forming the appearance of the terminal. In the present embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102.

The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102.
Examples of such electronic components include a detachable battery, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover can be detachably coupled to the rear housing 102. Thus, when the rear cover 103 is separated from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed.

As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, upon the coupling, the rear housing 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing an image capture apparatus 121b or an audio output module 152b.

The housings 101, 102, 103 may be formed by injection-molding a synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which the plurality of housings form an interior space for accommodating components, the mobile terminal 100 may be configured such that one housing forms the interior space. In this example, a mobile terminal 100 having a uni-body is formed so that synthetic resin or metal extends from a side surface to a rear surface.

If desired, the mobile terminal 100 may include a waterproofing unit to prevent the introduction of water into the terminal body. For example, the waterproofing unit may include a waterproofing member that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, to hermetically seal an interior space when those housings are coupled.

The mobile terminal 100 may include a display unit 151, first and second audio output modules 152a and 152b, a proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second image capture apparatuses 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.

Hereinafter, as illustrated in Figs. 1B and 1C, a description will be provided of the mobile terminal 100 in which the front surface of the terminal body is shown having the display unit 151, the first audio output module 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first image capture apparatus 121a, and the first manipulation unit 123a; the side surface of the terminal body is shown having the second manipulation unit 123b, the microphone 122, and the interface unit 160; and the rear surface of the terminal body is shown having the second audio output module 152b and the second image capture apparatus 121b.

However, these components are not limited to this arrangement. Some components may be omitted, rearranged or located on different surfaces. For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body rather than the rear surface of the terminal body.

The display unit 151 outputs information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information.
The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display, and combinations thereof. For example, a plurality of display units 151 may be arranged on one side, either spaced apart from each other or integrated, or these devices may be arranged on different surfaces.

The display unit 151 may also include a touch sensor that detects a touch input received on the display unit. When a touch is input to the display unit 151, the touch sensor can be configured to detect that touch, and the controller 180, for example, can generate a control command or other signal corresponding to the touch. Content input in a touching manner may be a text or numerical value, or a menu item that can be indicated or designated in various modes.

The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on the rear surface of the window 151a, or a metal wire that is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor can be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display.

The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen can serve as the user input unit 123 (see Fig. 1A). Thus, the touch screen can replace at least some of the functions of the first manipulation unit 123a.

The first audio output module 152a can be implemented as a receiver to transfer call sounds to a user's ear, and the second audio output module 152b can be implemented as a loudspeaker to output alarm sounds, multimedia audio reproduction, and the like.

The window 151a of the display unit 151 will typically include an aperture to allow the audio generated by the first audio output module 152a to pass through. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole independently formed to output audio sounds may not be seen or is otherwise hidden in terms of appearance, thereby further simplifying the appearance and manufacturing of the mobile terminal 100.

The optical output module 154 may be configured to output light to indicate event generation. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notification, an e-mail reception, information reception through an application, and the like. When a user has checked a generated event, the controller 180 can control the optical output module 154 to stop the light output.

The first image capture apparatus 121a may process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170.

The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide an input to the mobile terminal 100.
The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touching, pushing, scrolling, or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as a proximity touch, a hovering touch, or the like.

Fig. 1B illustrates the first manipulation unit 123a as a touch key, but possible alternatives include a push (or mechanical) key, a touch key, and combinations thereof.

Input received at the first and second manipulation units 123a and 123b may be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide an input to a menu, a home key, a cancel, a search, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control a volume level output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like.

As another example of the user input unit 123, a rear input unit may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scrolling, controlling the volume level output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit a touch input, a push input, or combinations thereof.

The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. As one example, the rear input unit may be located on an upper end portion of the rear side of the terminal body such that a user can easily manipulate it using an index finger when the user grips the terminal body with one hand. Alternatively, the rear input unit can be positioned at almost any location on the rear side of the terminal body.

When the rear input unit is provided on the rear surface of the terminal body, a new type of user interface using it can be implemented. Embodiments that include the rear input unit may implement some or all of the functionality of the first manipulation unit 123a in the rear input unit. As such, in situations where the first manipulation unit 123a is omitted from the front side, the display unit 151 can have a larger screen.

As a further alternative, the mobile terminal 100 may include a finger scan sensor that scans a user's fingerprint. The controller 180 can then use fingerprint information detected by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123.

The microphone 122 is shown located at an end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones may be implemented, with such an arrangement permitting the receiving of stereo sounds.

The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices.
For example, the interface unit 160 may include one or more of a connection terminal for connecting to another device (e.g., an earphone, an external speaker, or the like), a port for short-range communication (e.g., an IrDA port, a Bluetooth port, a wireless LAN port, and the like), or an electrical power supply terminal for supplying electrical power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for information storage.

As shown in Fig. 1C, the second image capture apparatus 121b is shown located on the rear side of the terminal body and has an image capture direction that is substantially opposite to the image capture direction of the first image capture apparatus 121a.

The second image capture apparatus 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such image capture apparatuses may be referred to as an "array image capture apparatus". When the second image capture apparatus 121b is implemented as an array image capture apparatus, images can be captured in a variety of ways using the plurality of lenses, and images of higher quality are obtained.

A flash 124 is shown adjacent to the second image capture apparatus 121b. When an image of a subject is captured with the image capture apparatus 121b, the flash 124 can illuminate the subject.

The second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communication.

At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna that configures a part of the broadcast receiving module 111 (see Fig. 1A) may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a housing that includes a conductive material.

An electric power supply unit 190 for supplying electrical power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 can receive electrical power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged in a wireless manner using a wireless charger. Wireless charging can be implemented by magnetic induction or electromagnetic resonance.

The rear cover 103 is shown coupled to the rear housing 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from an external impact or from foreign matter. When the battery 191 is detachable from the terminal body, the rear cover 103 can be detachably coupled to the rear housing 102.

An accessory for protecting the appearance or assisting or extending the functions of the mobile terminal 100 may also be provided on the mobile terminal 100. As one example of an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the functions of the mobile terminal 100.
Another example of an accessory is a touch pen for assisting or extending a touch input to a touch screen.

Now, Fig. 2A is a flowchart illustrating a method for controlling a mobile terminal in accordance with an embodiment of the present invention, and Figs. 2B and 2C are conceptual views illustrating the control method of Fig. 2A.

As shown in Figs. 2A and 2B, the controller 180 may output continuous images captured by the image capture apparatus 121 on the display unit 151 (S210). In particular, the controller 180 may execute a video capture function when a video capture control command is applied, and continuously output images obtained by the image capture apparatus 121 on the display unit 151. Since the continuous images output on the display unit 151 constitute a video file, a user can check or view these images constituting the video file on the display unit 151 in real time. In addition, the continuous images may correspond to visual information, i.e., preview images, which are obtained by the image capture apparatus 121 after the image capture apparatus 121 is activated but before a control command for image capture and video capture is applied.

The controller 180 may also control the memory 170 to temporarily store images obtained after the control command is applied, and generate (or form) a video file using the images 310. In more detail, the controller 180 may generate a video file using a plurality of images, which are captured up to the time of receipt of a stop command applied to stop the video capture.

When the video capture function is executed, the display unit 151 displays a control icon 301 for controlling the video capture function on the images 310 obtained by the image capture apparatus. As shown, the control icon 301 may include a first icon 301a for stopping the video capture and generating a video file, a second icon 301b for pausing the video capture, and a third icon 301c for storing images output on the display unit 151. The controller 180 may also display a capture time 301d, indicating an execution time of the video capture function, on an area of the display unit 151. The capture time 301d may stop changing when a touch is applied to the second icon 301b. In this case, the display unit 151 may continuously output a preview image obtained by the image capture apparatus 121.

In addition, the controller 180 can store a first image output on the display unit 151 at a time point at which a touch is applied to the display unit 151 (S220). The first image corresponds to an image obtained by the image capture apparatus 121 and output on the display unit 151 at the time the touch is applied. The first image is also an image included in the video file. Here, the touch applied to the display unit 151 may be applied to an image output on the display unit 151 or to the third icon 301c. The controller 180 may store the first image in the memory 170 in response to the touch. In another example, the controller 180 can store images obtained by the image capture apparatus 121 over a period of time, regardless of the touch applied.

In addition, the controller 180 may display a thumbnail image 411 corresponding to the first image on an area of the display unit 151 in response to the touch (S230). The display unit 151 may also continue to display an image obtained by the image capture apparatus in real time, in conjunction with the thumbnail image 411. A minimal sketch of this touch-to-store flow is given below.
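The following is a minimal sketch of steps S210, S220 and S230, provided for illustration only; the class, its fields and the callback names are invented here and do not come from the patent or from any concrete camera API:

```kotlin
// Hypothetical sketch of the touch-to-store flow (S210–S230). Frames are
// modeled as opaque ByteArrays; a real terminal would use actual camera
// preview callbacks and the platform's touch-event dispatch.
class CaptureController {
    private val memory = mutableListOf<ByteArray>()      // stands in for memory 170
    private val thumbnails = mutableListOf<ByteArray>()  // thumbnails shown on screen

    @Volatile
    private var currentFrame: ByteArray = ByteArray(0)

    // S210: invoked for every frame the image capture apparatus produces; the
    // touch screen continuously displays this frame.
    fun onPreviewFrame(frame: ByteArray) {
        currentFrame = frame
    }

    // S220/S230: invoked when a touch is applied while the images are displayed.
    fun onCaptureTouch() {
        val snapshot = currentFrame   // the image displayed at the touch time
        memory += snapshot            // S220: store the image in the memory
        thumbnails += snapshot        // S230: display a corresponding thumbnail
    }
}
```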
In more detail, the controller 180 may store an additional image in the memory 170, in response to a further touch applied to the display unit 151. In this case, the controller 180 may display a second thumbnail image 412 corresponding to the additional image, adjacent to the first thumbnail image 411. That is, the user can store images obtained by the image capture apparatus during video capture in the memory 170, in response to touches, independently of the stored video file.

In addition, the controller 180 can store a second image in the memory 170, which is output at a time when a touch is applied to the thumbnail image, in association with the first image. For example, the first and second images can be included in the same folder, so as to be distinguished from a folder in which the video file is stored. In addition, the display unit 151 may display a third thumbnail image 413 corresponding to the second image, and display the number (2) of associated stored images on the third thumbnail image 413. Similarly, when a further touch is applied to the third thumbnail image 413, three images can be stored in the related category.

Further, Fig. 2B illustrates that a thumbnail image of the most recently stored image is displayed on the display unit 151 in response to a touch applied to the thumbnail image, but the present invention is not limited to this. For example, when there is an image stored in association with the image of a thumbnail image, an image superimposed (layered) with the thumbnail image can be displayed, or a thumbnail image of the image stored first can be displayed continuously.

In this embodiment, the user can independently store an image by applying a touch while capturing a video, and can store a plurality of images in an associated manner by applying touches to thumbnail images of the stored images. Therefore, the user can store the plurality of images obtained during the video capture by grouping them. Since the plurality of images can be stored in distinct categories during video capture, the user does not need to perform a separate process of grouping the plurality of images afterwards.

Now, Fig. 2C is a conceptual view illustrating a control method for when video capture is complete. When a touch is applied to the first icon 301a, the controller 180 may terminate (stop) the video capture function. That is, the controller 180 can control the image capture apparatus 121 to turn it off, or control the display unit 151 to merely output continuously the images obtained by the image capture apparatus 121 without recording them. In addition, when the touch is applied to the first icon 301a, the controller 180 may generate a video file using the plurality of images obtained.

In particular, the display unit 151 may output a preview screen 300 corresponding to an image of the video file. The preview screen 300 may also include a graphical image for receiving a touch to play back (reproduce) the video file, a progress bar 302 for indicating a reproduction duration of the video file, storage points 303 each indicating a stored time of the images along the progress bar 302 while the capture function was executed, and thumbnail images 401 corresponding to the stored images. In addition, the number of storage points 303 may be substantially the same as the number of stored images. Thus, the storage points 303 allow the user to recognize the stored moments of the images within the video file's playback time, as well as the number of images stored together with the video file. In addition, the thumbnail images 401 can be arranged according to their stored times; the mapping from stored times to positions along the progress bar is sketched after this paragraph.
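As an illustration, the placement of each storage point 303 along the progress bar 302 could be computed as follows; the function and parameter names are assumptions made here, not taken from the patent:

```kotlin
// Illustrative only: map each stored image's capture timestamp to a horizontal
// pixel offset (a storage point 303) along the progress bar 302.
fun storagePointOffsets(
    storedTimesMs: List<Long>, // capture times of the stored images, in ms
    videoDurationMs: Long,     // total reproduction duration of the video file
    barWidthPx: Int            // on-screen width of the progress bar
): List<Int> =
    storedTimesMs.map { t ->
        ((t.toDouble() / videoDurationMs) * barWidthPx).toInt()
    }

// For example, images stored at 2.0 s and 7.5 s of a 10-second video map to
// offsets of 120 px and 450 px on a 600 px wide bar:
// storagePointOffsets(listOf(2_000L, 7_500L), 10_000L, 600) == listOf(120, 450)
```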
In the drawing of Fig. 2C, the plurality of thumbnail images 401 are arranged in series, adjacent to each other, in the order of their stored times. However, the output method of thumbnail images is not limited to this. The thumbnail images 401 may also be outputted overlapping or spaced apart according to the stored times. In addition, when the video file is generated, the display unit 151 may output a control image 510 for controlling the video file and the images. For example, the control image 510 may include a first control icon 511 for sharing a stored image, a second control icon 512 for sharing information about a selected view of the video, and a third control icon 513 for sharing a video of a selected view. The user can thus selectively share at least a portion of the stored video file and stored images. Here, sharing can be accomplished by transmitting information to a specific server or a specific external device using the wireless communication unit 110. In addition, the control image 510 may include a fourth control icon 514 to adjust an output duration or output period (seconds) of the selected view. Here, the output duration may correspond to a reproduction view (playback view) of a video file, which is selected according to a stored image. For example, when an image is selected and an output time is set to 5 seconds, a video reproduction view for 5 seconds before and after the selected image can be used. In addition, the fourth control icon 514 may be configured to allow time adjustment in response to a touch-and-drag applied thereto in one direction, but the present invention is not limited to this. According to this embodiment, when a video file is generated, a user may share the video file and images or select part of the video file and images. Now, Figures 3A and 3B are conceptual views illustrating a method of controlling image storage during video capture in accordance with an embodiment described herein. As shown in FIG. 3A, the display unit 151 can display the image 310 obtained by the image capture apparatus 121, the control icon 301 and the capture time 301d while a video capture function is performed. When a touch is applied to the image 310 or the third icon 301c, the controller 180 can store an image forming (constituting) a video file and display a first thumbnail image 411 corresponding to the image on the display unit 151. In addition, when a touch is applied to the third icon 301c, the controller 180 may store an image that is outputted to the display unit 151 at the applied touch time and output a second thumbnail image 412. In response to a specific touch input simultaneously applied to the first and second thumbnail images 411 and 412, the controller 180 can store an image, outputted at the applied touch time, in the memory 170. The specific touch input may correspond to a multiple touch input simultaneously applied to different areas on the display unit 151. Thus, two images that are substantially identical to each other can be stored in association, or grouped, with each of the first and second thumbnail images 411 and 412. When the images are stored, the display unit 151 can convert the first and second thumbnail images 411 and 412 into a fourth thumbnail image 414 corresponding to the stored images, and display the number (2) of the associated images on the fourth thumbnail image 414.
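The multi-touch grouping just described can be modeled as a map from thumbnail identifiers to image lists: a multiple touch files the same frame into each touched group. This is a minimal Kotlin sketch under that assumption; the identifiers echo the figures but are otherwise arbitrary.

```kotlin
// Thumbnail-keyed grouping: a touch on a thumbnail files the current frame
// into that group; a multiple touch on several thumbnails files the same
// frame into each touched group. The badge shown on a thumbnail is the
// size of its group.
class GroupingController {
    private val groups = mutableMapOf<Int, MutableList<String>>()

    // Touch on the live preview: start a new group holding this frame.
    fun storeNew(groupId: Int, frameId: String) {
        groups[groupId] = mutableListOf(frameId)
    }

    // Multiple touch on thumbnails: group the frame with each of them and
    // return the badge count per touched thumbnail.
    fun storeGrouped(frameId: String, vararg groupIds: Int): List<Int> =
        groupIds.map { id ->
            val group = groups.getOrPut(id) { mutableListOf() }
            group += frameId
            group.size
        }
}

fun main() {
    val c = GroupingController()
    c.storeNew(411, "frame-t1")
    c.storeNew(412, "frame-t2")
    // Multiple touch on thumbnails 411 and 412: the current frame joins both.
    println(c.storeGrouped("frame-t3", 411, 412))  // [2, 2]
}
```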
According to this embodiment, the user can store an image in association with a plurality of prestored images, by applying a touch input to a plurality of thumbnail images. Thus, the user does not need to copy an image after capturing a video in order to store the same image in a plurality of categories, and the like. As shown in Figure 3B, the controller 180 can store images at preset time intervals in the memory 170, in response to a specific touch input applied to the thumbnail image. In addition, the display unit 151 may output a fifth thumbnail image 415 corresponding to a first stored image, in response to a touch applied while the video capture function is performed. Then, in response to a touch input applied to the fifth thumbnail image 415, the controller 180 can store an image output on the display unit 151 at the applied touch time and display a thumbnail image corresponding to the stored image. In addition, as shown in Figure 3B, the controller 180 can detect the duration of a touch applied to the fifth thumbnail image 415. Here, the touch can be a long touch input that is held for a predetermined period. Namely, the controller 180 can detect a specific period "t" during which a touch input initially applied to the fifth thumbnail image 415 is maintained. When the touch is released, the controller 180 can store a first image 311 outputted to the display unit 151 at the instant at which the initial touch input is applied, and an image outputted to the display unit 151 at the time when the touch input is released. The display unit 151 may output a sixth thumbnail image 416 corresponding to a second image 312, which is the most recently stored image. In addition, the controller 180 may store a third image 313 in the memory 170 and display a seventh thumbnail image 417 corresponding to the third image 313 after a lapse of the specific period "t" from the moment the touch is released. According to this embodiment, the duration of a touch applied to a thumbnail image can be detected, and images displayed on the display unit 151 can be stored regularly at an interval corresponding to that duration. In addition, the regularly stored images can be stored in association with the image corresponding to the initial thumbnail image on which the touch was applied during the specific period. In addition, while the images are stored at the time interval, an additional image may be stored in response to a touch applied to an area of the display unit 151. In accordance with this embodiment, images can be stored at specific time intervals without an additional control command from the user during video capture. Now, Figs. 4A-4D are conceptual views illustrating a method of controlling image storage in a grouped manner when distinguishable subjects are detected. As shown in Fig. 4A, when first to third subjects A, B and C are detected by the image capture apparatus 121, the display unit 151 may display guide images 321 indicating the first to third subjects A, B and C, respectively, on an image 320 obtained by the image capture apparatus 121. The controller 180 can also discriminate (recognize) the first to third subjects A, B and C as different objects, based on the faces or the like of the first to third subjects A, B and C. In addition, the controller 180 can store an image and display a first thumbnail image 411 on the display unit 151, in response to a touch applied thereto, while the video capture function is performed.
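Before going further into the subject-detection embodiment, the long-touch timing of Fig. 3B can be captured in a few lines. This Kotlin sketch assumes millisecond timestamps and simply enumerates the storage instants; it is a model of the described timing, not the terminal's scheduler.

```kotlin
// Models the Fig. 3B timing: a frame is stored when the long touch lands,
// another when it is released after being held for period t, and further
// frames every t thereafter until capture ends. Returns storage instants.
fun storageInstants(touchDownMs: Long, touchUpMs: Long, captureEndMs: Long): List<Long> {
    val t = touchUpMs - touchDownMs          // the detected hold period "t"
    if (t <= 0) return listOf(touchDownMs)   // a plain tap stores one frame
    return generateSequence(touchDownMs) { it + t }
        .takeWhile { it <= captureEndMs }
        .toList()
}

fun main() {
    // Touch held from 4 s to 7 s while capture runs until 20 s.
    println(storageInstants(4_000L, 7_000L, 20_000L))
    // [4000, 7000, 10000, 13000, 16000, 19000] -> images 311, 312, 313, ...
}
```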
Returning to Fig. 4A: when a touch is applied to the guide image 321 corresponding to the first subject A, the controller 180 may store an image displayed on the display unit 151 at the applied touch time. Also, the image may be stored in association with the first thumbnail image 411. In addition, the image corresponding to the first thumbnail image 411 may be an image including the first subject A, or an image stored in response to a touch applied on the guide image 321 corresponding to the first subject A. When the image 320 is stored, the controller 180 may display an eighth thumbnail image 418 corresponding to the stored image 320 on the display unit 151. When a touch is applied to the eighth thumbnail image 418, the controller 180 may display the image 320 corresponding to the eighth thumbnail image 418 on the display unit 151. The stored image 320 may correspond to an image that is captured by focusing on the first subject A. When a touch is applied to a guide image, the controller 180 can capture an image by focusing the image capture apparatus 121 on the applied touch area, and store the captured image in association with other images pre-stored according to the subject corresponding to the guide image. According to this embodiment, the user can store a plurality of images, each with a desired subject in focus, so as to be distinguishable from other images. However, even if the guide image is not displayed, when a touch is applied to a discriminated person included in the captured image, the controller 180 may control the captured image in substantially the same manner as with the guide image. Namely, images may be stored in an associated manner, in response to touches applied to substantially the same object. Referring to Fig. 4B, the display unit 151 may output the guide image 321 including first to third guide images 321a, 321b and 321c corresponding to the first to third subjects A, B and C, respectively. The display unit 151 may output an image 320 detected by the image capture apparatus, and a first thumbnail image 418a corresponding to prestored images. When a touch is applied to the third icon 301c after a touch is applied to the first guide image 321a, the controller 180 can store images displayed on the display unit 151 in the memory 170 at preset time intervals, in association with the subject corresponding to the first guide image 321a. When a touch is applied to the first guide image 321a, the first guide image 321a can be transformed to indicate that the first subject A corresponding to the first guide image 321a has been selected. Here, a touch to select a guide image can be distinguished from another touch applied to the guide image to store an image. For example, the touch input applied to select the guide image in this embodiment may correspond to a long touch input. In addition, when a thumbnail image is generated in response to a touch applied to the first guide image 321a, the controller 180 can group the image displayed on the display unit 151 at the applied touch time into the same category as the image of the thumbnail image, for storage. The controller 180 may also store the image in the same category as the prestored images, employing substantially the same method of controlling the application of a touch on the first guide image 321a at the preset time interval "t". A third additional thumbnail image 418c may also be formed on the second thumbnail image 418b.
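Storing images keyed to a detected subject amounts to keeping one category per subject. A minimal Kotlin sketch under that assumption follows; the face detection and focusing themselves are not modeled, and the Subject labels are invented.

```kotlin
// Files stored frames under the detected subject whose guide image was
// touched; each subject's list is one category, as with subjects A, B, C.
data class Subject(val label: String)

class SubjectGrouping {
    private val bySubject = mutableMapOf<Subject, MutableList<String>>()

    // Touch on a subject's guide image: (re)focus on that subject -- not
    // modeled here -- then store the frame under the subject's category.
    fun storeFor(subject: Subject, frameId: String): List<String> {
        val group = bySubject.getOrPut(subject) { mutableListOf() }
        group += frameId
        return group
    }
}

fun main() {
    val g = SubjectGrouping()
    val a = Subject("A")
    g.storeFor(a, "frame-1")
    println(g.storeFor(a, "frame-2"))  // [frame-1, frame-2], grouped under A
}
```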
In more detail, the controller 180 may deselect a selected guide image in response to a touch applied to the selected guide image. When the guide image is deselected, images are no longer stored at the specific time interval. In addition, images stored at the specific time interval in the selected state of the first guide image 321a can be captured with the image capture apparatus focused on the first subject A corresponding to the first guide image 321a. In more detail, the controller 180 may select a plurality of guide images, and group and store images, in which the subjects corresponding to the different guide images are in focus, in different categories at specific time intervals. Also, the controller 180 may not store the image when the subject is not detected by the image capture apparatus. In addition, the controller 180 may store the image again at the specific time interval when the subject is detected again by the image capture apparatus 121. In addition, the controller 180 may automatically terminate the image storage function when the subject selected via the image capture apparatus disappears. Thus, according to this embodiment, the user can capture images including a desired subject at a specific time interval. Referring to FIG. 4C, the display unit 151 may display the guide images 321a, 321b and 321c on the first to third subjects A, B and C, respectively, while the image capture apparatus 121 is enabled. Additionally, the display unit 151 may display an image 320 obtained by the image capture apparatus 121, and a thumbnail image 418a of an image captured in response to a touch input. The controller 180 may also store the image and further display a thumbnail image corresponding to the image stored on the display unit 151, in response to a touch applied to the third icon 301c. In addition, the controller 180 can transform a shape of the third icon 301c, in response to a specific touch input applied to the third icon 301c. When a touch is applied to the first guide image 321a after the third icon 301c is transformed, the controller 180 can store an image obtained by the image capture apparatus 121 at the applied touch time, in association with a specific pre-stored image. In addition, the image can be captured with the applied touch area in focus. The controller 180 may also display a second thumbnail image 418b on the display unit 151 corresponding to the stored image. In addition, when a touch is applied to the third icon 301c in the transformed state of the third icon 301c, the controller 180 can store an image output on the display unit 151 at the instant of the touch. Also, the image can be stored in association with another image related to a subject included in the image. For example, when the thumbnail images correspond to the first to third subjects A, B and C, respectively, the images related to a subject can be stored in association with the images corresponding to the respective thumbnail images. In addition, in an untransformed state of the third icon 301c, when a touch is applied to the guide image, the focus can be changed while the video capture is performed. Therefore, a third thumbnail image 418c of an image stored in response to a touch applied to the third icon 301c can be displayed on the display unit 151. In more detail, each of a plurality of images captured in response to the touches can be captured and stored with a different area in focus.
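The pause-and-resume behavior above, where interval storage is suspended while the selected subject is out of frame, can be sketched as a filtered schedule. This is a hedged Kotlin model; the interval and the visibility window are invented for the example.

```kotlin
// Interval storage gated on subject visibility: frames are stored at the
// preset interval only while the selected subject is detected, and the
// schedule resumes automatically when the subject reappears.
fun gatedStorageInstants(
    intervalMs: Long,
    captureEndMs: Long,
    subjectVisibleAt: (Long) -> Boolean
): List<Long> =
    (0L..captureEndMs step intervalMs).filter(subjectVisibleAt)

fun main() {
    // The selected subject leaves the frame between 6 s and 12 s.
    val visible = { t: Long -> t < 6_000L || t > 12_000L }
    println(gatedStorageInstants(3_000L, 18_000L, visible))
    // [0, 3000, 15000, 18000]
}
```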
In addition, when a specific touch input (e.g., a long touch input) is applied to the third icon 301c, the controller 180 can restore the shape of the third icon 301c, and store an image output on the display unit 151 in association with images related to the touch applied on the guide image, or store the same image by grouping it into different categories. According to this embodiment, the user can store an image by applying a touch to an area including a desired person, and store the image in different categories in an associated manner. Now, FIG. 4D is a conceptual view illustrating a method of controlling the storage of an image based on a movement of a terminal main body. The mobile terminal according to this embodiment includes a detection unit for detecting movement of the terminal main body. For example, the detection unit can be implemented as at least one of an acceleration sensor, a magnetic sensor, a gravity sensor and a gyro sensor. The controller 180 can store a first image 311 displayed on the display unit 151 at the applied touch time when a touch is applied to the third icon 301c in a first posture. The display unit 151 may display a thumbnail image 411 corresponding to the first stored image 311. When a touch is applied to the third icon 301c in a second posture different from the first posture, the controller 180 may store a second image 312 displayed on the display unit 151 at the instant of the applied touch in the memory 170, and display a second thumbnail image 412 corresponding to the second image 312. When a touch is applied to the third icon 301c after detecting that the terminal body is held in the first posture again, the controller 180 can store a third image 313 displayed on the display unit 151 at the instant of the applied touch. The controller 180 may display a thumbnail image 413 corresponding to the third image 313, instead of the first thumbnail image 411. When the first posture of the terminal body is held for a preset time, the controller 180 may store the third image 313 in the memory 170 even if a touch is not applied to the third icon 301c. In accordance with this embodiment, images stored in a specific posture can be classified into the same category, which allows the user to categorize and store captured images based on the position from which the terminal captures a subject and on a movement of the terminal. Now, FIG. 5A is a flowchart illustrating a method of controlling a function associated with stored images, and FIGS. 5B through 5E are conceptual views illustrating methods of controlling the revision of a stored video file and images according to different embodiments described herein. As shown in Figure 5A, when a video capture is complete, the display unit 151 may display a video preview screen and a plurality of thumbnail images (S250). Here, the plurality of thumbnail images correspond to a plurality of stored images, respectively, and are arranged in the order in which they were stored while the video was captured. In particular, a thumbnail image may correspond to a stored image, and the display unit 151 does not display visual information relating to the associated images. In addition, the preview screen corresponds to an image included in the video file, and may include a progress bar indicating a duration of reproduction of the video file, reproduction information concerning the video file, and the like.
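Looking back at the posture-based grouping of Fig. 4D: classifying the posture from motion-sensor readings and using it as a category key can be sketched as follows. The accelerometer-based test is an assumed simplification for illustration, not the patent's detection method.

```kotlin
import kotlin.math.abs

// Buckets stored images by the terminal's posture at the touch instant,
// derived here from raw accelerometer axes (an assumed, simplified test).
enum class Posture { PORTRAIT, LANDSCAPE }

fun classify(ax: Float, ay: Float): Posture =
    if (abs(ay) >= abs(ax)) Posture.PORTRAIT else Posture.LANDSCAPE

class PostureGrouping {
    private val byPosture = mutableMapOf<Posture, MutableList<String>>()

    fun store(posture: Posture, imageId: String): List<String> =
        byPosture.getOrPut(posture) { mutableListOf() }.also { it += imageId }
}

fun main() {
    val g = PostureGrouping()
    g.store(classify(0.2f, 9.8f), "image-311")  // first posture
    g.store(classify(9.8f, 0.1f), "image-312")  // second posture
    // Back in the first posture: image 313 joins image 311's category.
    println(g.store(classify(0.1f, 9.7f), "image-313"))
}
```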
Still referring to Fig. 5A, the plurality of thumbnail images may be displayed adjacent to the area of the progress bar corresponding to their stored times, but the present invention is not limited thereto. Associated stored images can then be selected in response to a touch applied to one of the plurality of thumbnail images (S260). When the associated stored images are selected, the thumbnail images corresponding to the selected images can be displayed on the display unit 151 in a highlighted manner. In addition, the controller 180 may display partial reproduction images corresponding to the selected images on the progress bar (S270). In one example, a partial reproduction image corresponds to a video view that includes a selected image and is played back for a preset period. Thus, the user can select a portion of the video file, which includes stored images and is played back for a specific period of time, and control the selected portion of the video file. In addition, the controller 180 may control a function using an image corresponding to a selected thumbnail image, in response to a touch applied to a control image included in the preview screen. In particular, the function may correspond to a function that uses at least one of a selected image, a partial video including the selected image, and a complete video, which will be described in detail hereinafter with reference to the accompanying drawings. As shown in Figure 5B, the controller 180 may generate a video file using the captured images in response to a touch applied to the first icon 301a. While a preview image obtained by the activated image capture apparatus is displayed, the first icon 301a can be changed to a fifth icon 301e. When a touch is applied to the fifth icon 301e, images captured by the image capture apparatus 121 are stored and a video capture function may thus be executed. In addition, a graphic image 420 indicating the video file and the stored images can be outputted on the preview image. In particular, the graphic image 420 may include a thumbnail image corresponding to an image of the video file, and a number 421 of images stored during the execution of the video capture function. The controller 180 may also display screen information on the display unit 151 to check the video file and the stored images, in response to a touch applied to the graphic image 420. In more detail, the screen information may include a preview screen 300, a progress bar 302 indicating a reproduction time of the video file, a storage point 303 indicating a stored instant of at least one image, and thumbnail images 401 corresponding to the plurality of stored images. In addition, when the video file is generated, the display unit 151 may display a control image 510 to control the video file. As shown, the control image 510 may include a first control icon 511 for sharing a stored image, a second control icon 512 for sharing information about a selected view of the video, and a third control icon 513 for sharing a video of a selected view. Thus, the user can selectively share at least a portion of the video file and stored images. Here, sharing can be accomplished by transmitting information to a specific server or specific external device using the wireless communication unit 110. The control image 510 may also include a fourth control icon 514 to adjust an output duration (seconds) of the selected view. Referring to Fig.
5C, when a touch is applied to a first thumbnail image 401a among the plurality of thumbnail images 401, the controller 180 may adjust a reproduction time, based on a storage point 303 and a preset output duration of the first thumbnail image 401a. The reproduction time may include the times at which images corresponding to the thumbnail images 401 are displayed. For example, the reproduction time may be set as a duration from a specific time before the image output time to a specific time after the image output time. In more detail, when the output time is set to 5 seconds, the playback time can be set to a duration from 2 seconds before the image is output to 3 seconds after the image is output. In addition, the display unit 151 may display a partial reproduction image 304 indicating the reproduction time on the progress bar 302. The controller 180 may also move an indicator, which indicates a reproduction time on the progress bar 302, to an initial area of the partial reproduction image 304, and display an image on the display unit 151 corresponding to the reproduction area where the indicator is located on the progress bar 302. However, the reference for adjusting the reproduction time is not limited to this. For example, the playback time can be set to the 5 seconds after the stored image is displayed, or the 5 seconds before the stored image is displayed. The user can also set a playback time based on a stored image, in order to partially display a video file or to share a view of the video file corresponding to the set playback time. Referring to FIGS. 5C and 5D, when a first touch is applied to the thumbnail image 401a, the controller 180 may display a partial reproduction image corresponding to the thumbnail image 401a on the display unit 151. Furthermore, when a second touch is applied to the thumbnail image 401a, the controller 180 can completely display an image 401a' on the display unit 151, which corresponds to the thumbnail image 401a and is stored in the memory 170. For example, the second touch may correspond to a long touch input or a double touch input applied to the thumbnail image 401a for a specific period. The controller 180 may then display screen information including the preview screen 300 and the control image 510 on the display unit 151 when a touch is applied to the stored image 401a'. According to this embodiment, the user can check a reproduction duration corresponding to a stored image or quickly check the stored image, by applying a different type of touch to a thumbnail image. Referring to Fig. 5E, when the user selects the first thumbnail image 401a from the plurality of thumbnail images 401, the controller 180 can transform the selected first thumbnail image 401a, and also transform a second thumbnail image 401b of an image stored in association with the image of the first thumbnail image 401a. For example, when the images corresponding to the first and second thumbnail images 401a and 401b are stored in association, the controller 180 can transform the thumbnail images 401a and 401b in response to a touch applied to the first thumbnail image 401a. Namely, images stored in the same category can be selected. In addition, the other thumbnail images 402, which are stored in categories different from that of the image of the first thumbnail image 401a, can also be transformed.
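The reproduction-time arithmetic described for Fig. 5C is a clamped interval around the stored instant. A hedged Kotlin sketch, assuming millisecond units and using the asymmetric 2 s / 3 s split from the example above:

```kotlin
// The reproduction window around a stored image need not be symmetric: a
// 5 s output duration can split into 2 s before and 3 s after the stored
// instant, clamped to the bounds of the video file.
data class Window(val startMs: Long, val endMs: Long)

fun reproductionWindow(
    storedAtMs: Long,
    beforeMs: Long,
    afterMs: Long,
    videoLengthMs: Long
): Window = Window(
    (storedAtMs - beforeMs).coerceAtLeast(0L),
    (storedAtMs + afterMs).coerceAtMost(videoLengthMs)
)

fun main() {
    println(reproductionWindow(10_000L, 2_000L, 3_000L, 60_000L))
    // Window(startMs=8000, endMs=13000)
}
```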
Returning to Fig. 5E: for example, the controller 180 may highlight the first and second thumbnail images 401a and 401b, or display the other thumbnail images, except the first and second thumbnail images 401a and 401b, in an inactivated state. Further, the controller 180 may display the partial reproduction images 304 corresponding to the first and second thumbnail images 401a and 401b on the progress bar 302, in response to a touch applied to the first thumbnail image 401a. When a touch is applied to the control image 510 after a portion of the thumbnail images is selected, a stored image, information about a selected view of the video, or a video of a selected view can be shared. Now, Figures 6A and 6B are conceptual views illustrating a method of controlling the individual selection of distinctly stored images. As shown in FIG. 6A, the controller 180 can select the first and second thumbnail images 401a and 401b in response to a touch applied to the first thumbnail image 401a among the plurality of thumbnail images 401. The controller 180 can then display the partial reproduction images 304 corresponding to the first and second thumbnail images 401a and 401b on the progress bar 302. The display unit 151 can also display the first and second thumbnail images 401a and 401b and the other thumbnail images 402 in different ways. In addition, the controller 180 may select a third thumbnail image 402a, in response to a touch applied to the third thumbnail image 402a among the other thumbnail images 402. Also, the controller 180 may transform the third thumbnail image 402a, and further display a partial reproduction image 304 of an image corresponding to the third thumbnail image 402a. In addition, the controller 180 may select another thumbnail image, which belongs to the same category as the first thumbnail image 401a, in response to a touch applied to the first thumbnail image 401a, and further select a third thumbnail image 401c that belongs to a different category. In this case, images that are grouped into different categories, view information, and a video view can be shared. Referring to Fig. 6B, the controller 180 may display a fifth control icon 415 on the display unit 151 to individually select thumbnail images. The controller 180 may then select some of the plurality of thumbnail images 401 when a touch is applied to the fifth control icon 415. The fifth control icon 415 can also be transformed in response to a touch applied thereto. When touches are then applied to some of the plurality of thumbnail images after the touch is applied to the fifth control icon 415, those thumbnail images may be selected. In this case, although a thumbnail image is selected, another thumbnail image belonging to the same category may not be selected. That is, after a touch is applied to the fifth control icon 415, each thumbnail image can be individually selected by touching it. In addition, the controller 180 may display the partial reproduction image 304 corresponding to the selected thumbnail image. In more detail, when a touch is applied again to the fifth control icon 415, the controller 180 can display thumbnail images stored in the same category for immediate selection. Now, Figs. 7A-7C are conceptual views illustrating a method of controlling the revision of a classification (categorization) of images according to different embodiments. As shown in Figs. 5A and 7A-7C, thumbnail image positions and control image positions may differ according to the plurality of stored images.
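The category-wide selection of Fig. 6A, where one touch selects every image grouped with the touched thumbnail, is essentially a lookup of the group containing the touched identifier. A minimal Kotlin sketch with invented identifiers:

```kotlin
// Touching one thumbnail selects every image stored in the same category;
// thumbnails outside the category are dimmed. Returns the ids to highlight.
fun selectCategory(
    categories: Map<String, List<String>>,
    touchedThumb: String
): List<String> =
    categories.values.firstOrNull { touchedThumb in it } ?: listOf(touchedThumb)

fun main() {
    val categories = mapOf(
        "cat-1" to listOf("401a", "401b"),
        "cat-2" to listOf("402a")
    )
    println(selectCategory(categories, "401a"))  // [401a, 401b] both highlight
}
```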
In these embodiments, however, the thumbnail images and the control image displayed on the display unit may have substantially the same function. Further, the control method illustrated in Figs. 7A-7C can also be applied substantially identically to the screen information including the thumbnail images and the control image illustrated in Fig. 5A. Referring to Fig. 7A, the controller 180 may terminate the video capture function when a touch is applied to the first icon 301a, and the first icon 301a may be converted to the fifth icon 301e. In addition, the graphic image indicating the video file and the stored images can be displayed on a preview image obtained by the image capture apparatus. The graphic image may include a thumbnail image corresponding to an image of the video file, and a number 421 of images stored during the execution of the video capture function. In addition, the controller 180 may display screen information on the display unit to check the video file and the stored images, in response to a touch applied to the graphic image 420. As previously described, the screen information may include a progress bar 302 indicating a video file playback time, a storage point 303 indicating a stored time of at least one image, and thumbnail images 401 corresponding to the plurality of stored images. In addition, the controller 180 may display, on an area of the display unit 151, a control region 430 which includes a plurality of thumbnail images 440, and first and second control icons 431 and 432. The plurality of thumbnail images 440 may be arranged in a direction intersecting the progress bar 302, and are preferably arranged in the order of their stored instants while the video capture function is executed. In addition, an output duration (seconds) of a selected view may be adjusted in response to a touch applied to the first control icon 431. For example, the first control icon 431 may include a number that receives a sliding touch input and indicates a changed output duration. The second control icon 432 may also receive a touch to store revised images when the plurality of thumbnail images are revised. A method of controlling the revision of a categorization of the plurality of thumbnail images will be described in detail below with reference to Figs. 8A-8D. As shown in Figures 7B and 7C, the controller 180 can completely display an image corresponding to a thumbnail image 440' among the plurality of thumbnail images 440, in response to a first touch applied to the thumbnail image 440'. For example, the first touch can be a long touch input. Also, when the first touch is reapplied while the image is displayed on the display unit 151, screen information including the preview screen and the first and second control icons may be displayed. In addition, when a second touch is applied to a first thumbnail image 441a among the plurality of thumbnail images 440, the controller 180 can select the first thumbnail image 441a. When the first thumbnail image 441a is selected, a second thumbnail image 441b belonging to the same category as the first thumbnail image 441a can also be selected. The display unit 151 can then output the first and second thumbnail images 441a and 441b and the other thumbnail images 442 in different ways or in different shapes, so as to be distinguishable from each other. In addition, the controller 180 may display partial reproduction images 304 on the display unit 151 corresponding to the first and second thumbnail images 441a and 441b.
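Sliding on the first control icon 431 maps drag distance to a new output duration. The Kotlin sketch below is hedged accordingly: the pixel-to-millisecond ratio and the clamping range are assumptions for illustration, not values from the description.

```kotlin
// Sliding on the duration icon adjusts the output duration of the selected
// view; the 20 ms-per-pixel ratio and the 1-30 s clamp are assumed values.
fun adjustedDurationMs(currentMs: Long, dragPx: Int): Long =
    (currentMs + dragPx * 20L).coerceIn(1_000L, 30_000L)

fun main() {
    println(adjustedDurationMs(5_000L, 100))   // 7000
    println(adjustedDurationMs(5_000L, -300))  // 1000 (clamped at the floor)
}
```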
In more detail, when a touch is applied to the second control icon 432 after at least one thumbnail image is selected, the controller 180 may display screen information on the display unit 151 to control image sharing and the like. Now, Figures 8A to 8D are conceptual views illustrating a method for controlling the modification of pre-stored image categories. As shown in Fig. 5A and Figs. 8A-8D, thumbnail image positions and control image positions may differ according to the plurality of stored images. However, the thumbnail images and the control image outputted to the display unit 151 may have substantially the same function. The control method illustrated in Figs. 8A-8D can also be applied substantially identically to the screen information including the thumbnail images and the control image illustrated in Fig. 5A. As shown in FIG. 8A, when a touch is applied to the first thumbnail image 441a among the plurality of thumbnail images, the controller 180 can select the second thumbnail image 441b classified in the same category as the first thumbnail image 441a. Images corresponding to the first and second thumbnail images 441a and 441b may belong to a first category 441. When the plurality of thumbnail images 440 are arranged in a first direction, the first and second thumbnail images 441a and 441b selected in response to the touch can move in a second direction that intersects the first direction. Also, the other thumbnail images can move in a direction opposite to the second direction. As a result, the user can easily distinguish the selected thumbnail images from the unselected thumbnail images. When a touch is applied to one of the other thumbnail images, excepting the first and second thumbnail images 441a and 441b, the first and second thumbnail images 441a and 441b may be deselected, and the other thumbnail images may be selected. Images corresponding to the other thumbnail images may also be grouped in the same category in an associated manner. In addition, the controller 180 may display partial reproduction images corresponding to the selected thumbnail images on the display unit 151, in response to touches applied to the thumbnail images. The controller 180 may also deselect the selected thumbnail image when a specific touch input is applied to the selected thumbnail image. For example, the specific touch input can be a long touch input. When the selected thumbnail image is deselected, the controller 180 may stop displaying the partial reproduction image 304 and switch the second control icon 432 to an inactive state. Referring to Fig. 8B, when the first and second thumbnail images 441a and 441b are selected, the controller 180 may display the selected first and second thumbnail images 441a and 441b in a state moved in the second direction. When a sliding touch input is applied in the second direction on a third thumbnail image 442a corresponding to an image stored in a category different from that of the first and second thumbnail images 441a and 441b, the image corresponding to the third thumbnail image 442a can be stored in association with the first and second thumbnail images 441a and 441b. In this case, the association of the third thumbnail image 442a with the other images may be released. Namely, the category in which the image corresponding to the third thumbnail image 442a is stored can be changed.
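This drag-to-regroup behavior boils down to removing the image from its old category and inserting it into the target one. A minimal Kotlin sketch with invented category and image identifiers:

```kotlin
// A drag in the "second direction" files an image into the selected
// category and releases its previous association, as with image 442a.
class Categories {
    private val byCategory = mutableMapOf<String, MutableSet<String>>()

    fun put(category: String, imageId: String) {
        byCategory.values.forEach { it.remove(imageId) }  // release old link
        byCategory.getOrPut(category) { mutableSetOf() } += imageId
    }

    fun categoryOf(imageId: String): String? =
        byCategory.entries.firstOrNull { imageId in it.value }?.key
}

fun main() {
    val c = Categories()
    c.put("cat-442", "image-442a")
    c.put("cat-441", "image-442a")       // drag onto the 441 group
    println(c.categoryOf("image-442a"))  // cat-441; old association released
}
```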
On the display, the third thumbnail image 442a can then move in the first direction to be displayed in parallel with the first thumbnail image 441a. Also, a partial reproduction image corresponding to the third thumbnail image 442a may be displayed. In addition, when a touch is applied to the second thumbnail image 441b in a direction opposite to the second direction, the associated state between the image corresponding to the second thumbnail image 441b and the image corresponding to the first thumbnail image 441a can be released. Namely, the category in which the image of the second thumbnail image 441b is stored can be changed. The image of the second thumbnail image 441b can also be stored in association with the other thumbnail images 442, but the present invention is not limited thereto. Namely, the controller 180 can change the category in which an image is stored, in response to a touch applied to the thumbnail image in the second direction or in the direction opposite to the second direction, and change the output position and shape of the thumbnail image on the display unit 151 to notify the category change. According to this embodiment, the user can recognize the associated state among the plurality of stored images, and more easily change the associated stored state (category). Hereinafter, a method of controlling the adjustment of the reproduction time will be described with reference to FIG. 8C. In particular, the display unit 151 may output partial reproduction images 304 corresponding to the selected first and second thumbnail images 441a and 441b. The partial reproduction images 304 corresponding to the first and second thumbnail images 441a and 441b, respectively, may have substantially the same reproduction duration. To indicate the reproduction time, the partial reproduction image 304 may be displayed with a first width W1 in the extension direction of the progress bar 302. When the reproduction time is prolonged in response to a touch applied to the first control icon 431, the controller 180 can change the width of the partial reproduction image 304 to a second width W2 larger than the first width W1. Here, the partial reproduction images 304 corresponding to the first and second thumbnail images 441a and 441b, respectively, which are stored in the associated state, can change together. However, the present invention is not limited to this. According to this embodiment, the user can easily change the reproduction time of a view including images and set the same time for associated images. Referring to Fig. 8D, after selecting the thumbnail images and setting the reproduction time, when a touch is applied to the second control icon 432, the controller 180 may display a sharing window 450 to share the adjusted images and the like. The sharing window 450 may be displayed as a pop-up window, and include a plurality of icons for receiving touches to select data to be shared. This allows the user to review the data to be shared while checking a plurality of thumbnail images. Now, FIGS. 9A through 9C are conceptual views illustrating a method of controlling the provision of video files and stored images. As shown in Figure 9A, the controller 180 may display a graphic image 510, which includes a video file 511 and images stored together, on an execution screen 500 when a specific application is executed. The graphic image 510 may also include the dates at which the video file and images were generated, and images indicating the video file 511 and the images 512.
The controller 180 may then display screen information including the video file 511 and the images 512, in response to a touch applied to the graphic image 510. The display unit 151 may also display a preview screen formed by an image included in the video file 511 on a first region 520, and images stored in conjunction with the video file 511 on a second region 530. In addition, the second region 530 may output a plurality of stored images distinctly grouped by association. That is, the second region 530 can be divided into first to third category regions 531, 532 and 533. Thumbnail images of associated images can be outputted on each category region. Also, the screen information may further include a control window 540 for sharing the video file and the images. When a thumbnail image 531a is selected on the second region 530, the controller 180 may display an image 520' corresponding to the selected thumbnail image 531a on the preview screen. Also, a partial reproduction image 304 corresponding to the thumbnail image 531a may be outputted on a progress bar. This allows the user to recognize when each image was stored, even while associated images are output. In addition, the controller 180 may change a category in response to a continuous touch input applied to the thumbnail image 531a. Namely, the controller 180 can move the thumbnail image 531a from the first category region 531 to the second category region 532, in response to a sliding touch input applied to the thumbnail image 531a. As a result, the user can easily change the associated state of the images while the images are output for each category. Referring to Fig. 9B, the display unit 151 may display a selection icon 534 on each category region. The controller 180 may then select a plurality of images associated with each other in response to a touch applied to the selection icon 534. For example, a plurality of images included in the second category region 532 may be selected, and partial reproduction images 304 corresponding to the selected images can be outputted. In more detail, the controller 180 can share the images included in the selected category region, and this allows the user to easily select the plurality of associated images. Referring to Fig. 9C, the controller 180 may select a plurality of images, regardless of categories, in response to a touch applied to the control window 540. In particular, the control window 540 may include an icon for selecting a plurality of images individually. After selecting the icon, images can be selected in response to touches applied to the thumbnail images. The display unit 151 may also sequentially output, on the progress bar 302, partial reproduction images 304 corresponding to the images selected in response to the touches. In accordance with this embodiment, the user can immediately change and store a preview image obtained by the image capture apparatus in real time. The present invention can be implemented as computer-readable codes in a program-stored medium. The computer readable medium may include any type of recording device, each storing data readable by a computer system. Examples of such computer readable media may include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage element and the like. Also, the computer readable medium can be implemented in the form of a carrier wave (e.g., transmission over the Internet).
The computer may include the controller 180 of the terminal. Therefore, it is also to be understood that the embodiments described above are not limited by any of the details of the foregoing description unless otherwise indicated, but rather should be interpreted broadly within the scope of the present invention as defined by the protection sought, and therefore any changes and modifications that fall within the bounds and limits of the protection sought are intended to be encompassed. Of course, the invention is not limited to the embodiments described above and shown, from which other modes and other embodiments can be provided without departing from the scope of the invention.
Claims (15)

1. A mobile terminal, comprising: a view capturing apparatus (121); a touch screen (151) configured to display images (310) captured by the image capture apparatus (121); a memory (170); and a controller (180) configured to: in response to a first touch applied to the touch screen (151) at a first instant, store a first image in the memory (170), wherein the first image corresponds to that displayed on the touch screen at the instant of the first touch; display a first thumbnail image (411) corresponding to the first image on the touch screen; and, in response to a second touch applied to a specific area of the touch screen (151) displaying the images and the first thumbnail image (411) at a second instant, store in the memory (170) a second image displayed on the touch screen (151) at the second instant, and group the second image with the first image in the memory (170).

2. The mobile terminal of claim 1, wherein the controller is further configured to store the first and second images in the memory (170) when the specific area of the touch screen is the one displaying the first thumbnail image (411).

3. The mobile terminal of claim 2, wherein the touch screen (151) is further configured to change the first thumbnail image (411) to a second thumbnail image (412) corresponding to the second image in response to the touch applied to the first thumbnail image (411), and wherein the second thumbnail image (412) includes visual information indicative of a number of grouped images.

4. The mobile terminal of claim 2, wherein the controller (180) is further configured to, in response to a multiple touch input applied to the first and second thumbnail images displayed on the touch screen (151), store the first image (411) with an image displayed on the touch screen (151) while the multiple touch is applied, and store the second image (412) with the image displayed on the touch screen (151) while the multiple touch is applied.

5. The mobile terminal according to any one of claims 1 to 4, further comprising: a sensor configured to detect a posture of the mobile terminal, wherein, if the first touch is applied to the touch screen (151) with the mobile terminal in a first posture and the second touch is applied to the touch screen with the mobile terminal in a second posture, the controller (180) is further configured to group the second image (411) with the first image (412) in response to a third touch applied to the touch screen (151) when the mobile terminal is put back into the first posture from the second posture.

6. The mobile terminal according to any one of claims 1 to 5, wherein the controller (180) is further configured to: display a guide image (321a) indicating a detected subject on the images displayed on the touch screen (151), and store captured images in the memory (170) by grouping the stored images in response to a touch applied to the guide image.

7. The mobile terminal of claim 6, wherein the controller (180) is further configured to capture an image with the subject in focus in response to the touch applied to the guide image.
8. The mobile terminal of any one of claims 1 to 7, wherein the controller (180) is further configured to: display an icon on the touch screen (151) for storing an image during video capture, and store images in the memory (170) at an interval of a specific period in response to a touch applied to the icon during the specific period.

9. The mobile terminal of claim 8, wherein the controller (180) is further configured to: display a plurality of thumbnail images on the touch screen (151), and store images corresponding to the respective thumbnail images by grouping the images captured at the specific time interval.

10. The mobile terminal of any one of claims 1 to 9, wherein the controller (180) is further configured to: generate a video file using the captured images when a video capture is complete, display, on the touch screen (151), a progress bar (302) of the video file and a plurality of thumbnail images corresponding to the plurality of captured images, and display storage points indicating the stored times of the plurality of images, respectively, on the progress bar (302).

11. The mobile terminal of claim 10, wherein the controller (180) is further configured to display on the progress bar (302) a partial reproduction image (304) indicating a reproduction time of a video reproduction view including the plurality of images, in response to a selected thumbnail image.

12. The mobile terminal of claim 10, wherein the controller (180) is further configured to: arrange the plurality of thumbnail images on the touch screen (151), and display at least one grouped thumbnail image by moving the at least one grouped thumbnail image in response to a touch applied to a thumbnail image.

13. The mobile terminal of claim 12, wherein the controller (180) is further configured to display a share window (450) for sharing at least a portion of the grouped images and the video file with a specific server or specific external device.

14. The mobile terminal of any one of claims 1 to 13, wherein the controller (180) is further configured to, in response to the completed video capture, display grouped images in a plurality of category regions, the category regions (531, 532, 533) being distinguished from one another.

15. A method of controlling a mobile terminal, the method comprising: displaying images captured by an image capture apparatus on a touch screen (151) of the mobile terminal; in response to a first touch applied to the touch screen (151) at a first instant, storing a first image (310) in a memory (170), wherein the first image corresponds to that displayed on the touch screen (151) at the instant of the first touch; displaying a first thumbnail image (411) corresponding to the first image on the touch screen (151); and, in response to a second touch applied to a specific area of the touch screen (151) displaying the images and the first thumbnail image (411) at a second instant, storing, in the memory (170), a second image displayed on the touch screen at the second instant, and grouping the second image with the first image in the memory (170).
Patent family:
Publication number | Publication date
CN106686206A | 2017-05-17
KR20170054924A | 2017-05-18
EP3168727B1 | 2020-10-14
KR101832966B1 | 2018-02-28
US20170134605A1 | 2017-05-11
CN106686206B | 2020-06-09
FR3043478B1 | 2019-12-20
EP3168727A2 | 2017-05-17
EP3758361A1 | 2020-12-30
EP3168727A3 | 2017-07-12
US9860411B2 | 2018-01-02
Legal status:
2017-05-30 | PLFP | Fee payment | Year of fee payment: 2
2017-06-30 | PLSC | Search report ready | Effective date: 20170630
2018-05-29 | PLFP | Fee payment | Year of fee payment: 3
2019-03-27 | PLFP | Fee payment | Year of fee payment: 4
2020-04-24 | PLFP | Fee payment | Year of fee payment: 5
2021-04-09 | PLFP | Fee payment | Year of fee payment: 6
Priority:
Application number | Application date | Patent title
KR1020150157692A | 2015-11-10 | KR101832966B1 | 2018-02-28 | Mobile terminal and method of controlling the same
KR1020150157692 | 2015-11-10