MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
Patent abstract:
The present invention provides a mobile terminal (100) comprising a display unit (151) arranged to display screen information corresponding to a specific function and to receive a touch to control the specific function, and a controller (180) designed to determine whether or not an error has occurred from the specific function, based on changes of the screen information and error information in the screen information during the control of the specific function, wherein the controller controls the display unit to display a guidance image (603) for searching for error data associated with the error when it is determined that the error has occurred.
Publication number: FR3040096A1
Application number: FR1655341
Filing date: 2016-06-10
Publication date: 2017-02-17
Inventors: Jaehwa Lee; Hyerim Ku
Applicant: LG Electronics Inc.
IPC main class:
Patent description:
MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
The present invention relates to a mobile terminal capable of providing guidance information when it determines that an error has occurred. A mobile terminal is a device that includes a battery and a display unit, displays information on the display unit using the power supplied by the battery, and can be carried by a user. The mobile terminal includes a device that records and reproduces videos and a device that displays graphical user interfaces (GUIs); examples of mobile terminals include a notebook computer, a mobile phone, glasses and a watch capable of displaying screen information, a game machine and the like. Having become multifunctional, a mobile terminal can capture still images or moving images, play music or video files, play games, receive broadcasts and the like, so as to be implemented as an integrated multimedia player. Significant efforts are needed not only to modify and improve the structural components that constitute a mobile terminal, but also to improve the software so as to support and enhance the functions of the terminal. As a result, it is difficult for a user to know how to use these various functions correctly and to operate a desired function properly, because of an improper method of use or a faulty system. Moreover, mobile device manufacturers do not provide detailed instructions for each function, and the user is therefore unable to use the function correctly. Therefore, one aspect of the detailed description is to promptly provide guidance information for resolving an error that occurs during use. To provide these and other advantages, and in accordance with the purposes of this specification as implemented and broadly described herein, there is provided a mobile terminal including a display unit adapted to display screen information corresponding to a specific function and to receive a touch to control the specific function, and a controller designed to determine whether or not an error has occurred from the specific function, based on changes of the screen information and error information in the screen information during the control of the specific function, the controller controlling the display unit to display a guidance image for searching for error data associated with the error when it is determined that the error has occurred. In an exemplary embodiment, the controller may determine that the error occurred when the screen information is repeatedly displayed or when a notification window reporting the error information is displayed on the display unit. This can allow the user to resolve the error immediately without even running a specific application. In an exemplary embodiment, a text image extracted from the notification window may be defined as a search term for searching for error data, in response to a continuous touch applied to the notification window and to the guidance image, or text extracted from a capture image of the screen information can be defined as a search term. This may eliminate the step of entering an appropriate search word to search for error data. In an exemplary embodiment, the controller may perform a video call function with an external device and transmit the screen information to the external device when it is determined that the error has occurred. This may allow the user to seek the help of another user in real time when the error occurs.
In an exemplary embodiment, when the user repetitively displays the same screen information, this may be determined as an error, and a method for faster execution of an application to resolve the error may be provided. Furthermore, when it turns out that the repetitively displayed screen information reflects the user's intent, the display of an icon may be restricted so that it is easier for the user to control the screen information. In addition, an image or video from which an error has occurred can be stored, and text associated with the error information can be extracted from the image or video and used as a search term, thereby providing more accurate error data. Other fields of application of the present invention will emerge more clearly on reading the detailed description given below. However, it should be understood that the detailed description and specific examples, although indicating preferred embodiments of the invention, are provided for illustrative purposes only since, as will be understood by those skilled in the art upon reading the detailed description, various variants and modifications may be considered while remaining within the spirit and scope of the invention. The accompanying drawings, which are included to provide a better understanding of the invention and form an integral part of this specification, illustrate exemplary embodiments and serve, together with the description, to explain the principles of the invention. In the drawings: Fig. 1A is a block diagram of a mobile terminal according to an exemplary embodiment of the present invention; Figures 1B and 1C are conceptual views that show an example of the mobile terminal, seen from different directions; Fig. 2A is a flow chart showing a method of controlling a mobile terminal according to an exemplary embodiment of the present invention; Figures 2B to 2E are conceptual views showing the control method of Figure 2A; Figs. 3A to 3D are conceptual views showing a control method for providing guidance information upon occurrence of an error according to various exemplary embodiments of the present invention; Figs. 4A-4C are conceptual views showing a method of providing error data using a capture image; Fig. 5 is a conceptual view showing a control method for executing a specific application according to another exemplary embodiment of the present invention; Figs. 6A-6D are conceptual views showing a control method for providing error data according to various exemplary embodiments of the present invention; Figs. 7A and 7B are conceptual views showing a control method for providing guidance information; and Figs. 8A-8C are conceptual views showing a control method for recording error information in real time. We will now proceed to a detailed description of the invention, according to the exemplary embodiments disclosed herein and in light of the accompanying drawings. To shorten the description made with reference to the drawings, identical or equivalent components may be indicated by identical or similar reference numbers, and their description will not be repeated. In general, a suffix such as "module" and "unit" may be used to designate elements or components. The use of such a suffix in this document is merely intended to facilitate the description of the specification; the suffix itself does not carry any special meaning or function.
The accompanying drawings make it easy to understand various technical features and, as will be appreciated, the embodiments presented herein are in no way limited by the accompanying drawings. As such, the present invention should be understood to encompass all modifications, equivalents and substitutes in addition to those specifically indicated in the accompanying drawings. The mobile terminals presented here can be implemented through various terminals of different types. Examples of these terminals include cellular phones, smartphones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players, navigation devices, slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, virtual reality headsets) and the like. By way of non-limiting example only, the description will be made with reference to particular types of mobile terminals. These explanations, however, also apply to other types of terminals, such as the types mentioned above. In addition, these explanations may also apply to fixed terminals such as digital TVs, desktop computers and the like. Referring now to Figs. 1A to 1C, Fig. 1A is a block diagram of a mobile terminal according to the present invention, and Figs. 1B and 1C are conceptual views of a first example of the mobile terminal, seen from different directions. The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power supply unit 190. As will be understood, it is not mandatory to implement all the components shown, and greater or fewer components may alternatively be used. In more detail, the wireless communication unit 110 may generally comprise one or more modules that allow communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, or communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may generally comprise one or more modules that connect the mobile terminal 100 to one or more networks. The wireless communication unit 110 may comprise one or more of the following: a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115. The input unit 120 may comprise a camera 121 or an image input unit for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a mechanical key, and the like) that allows a user to enter information. Data (e.g., audio, video, image, and the like) can be obtained by the input unit 120 and can be analyzed and processed according to user commands. The detection unit 140 may generally be implemented using one or more sensors designed to detect internal information of the mobile terminal, the ambient environment of the mobile terminal, user information and the like.
For example, the detection unit 140 may comprise at least one of the following sensors: a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor (for example, the camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detector, a thermal sensor and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a medical sensor, a biometric sensor and the like). The mobile terminal described herein may be adapted to use information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 may generally be designed to provide various types of output, including audio, video, tactile and similar outputs. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153 and an optical output module 154. The display unit 151 may have a layered structure or an integrated structure with a touch sensor in order to facilitate implementation of a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, and may also function as a user input unit 123 which provides the input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with different types of external devices that can be coupled to the mobile terminal 100. The interface unit 160 may include, for example, one or more of the following: wired or wireless headphone ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform various control functions associated with a connected external device, when the external device is connected to the interface unit 160. The memory 170 is generally implemented so as to store data supporting the various functions or features of the mobile terminal 100. For example, the memory 170 may be designed to store application programs executed in the mobile terminal 100, data or instructions relating to the operations of the mobile terminal 100, and the like. Some of these application programs can be downloaded from an external server via wireless communication. Other application programs may be installed in the mobile terminal 100 at the time of its manufacture or shipping, which is generally the case for the basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message and the like). It is common for application programs to be stored in the memory 170, installed on the mobile terminal 100 and executed by the controller 180 to perform an operation (or function) of the mobile terminal 100. The controller 180 generally has the function of controlling the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs.
The controller 180 can provide or process information or functions appropriate for a user by processing the signals, data, information and the like that are input or output by the various aforementioned components, or by activating application programs stored in the memory 170. In addition, the controller 180 controls all or some of the components shown in Fig. 1A according to the execution of an application program that has been stored in the memory 170. Furthermore, the controller 180 can control at least two of the components included in the mobile terminal in combination in order to activate the application program. The power supply unit 190 may be designed to receive external power or provide internal power in order to supply the power required for operating the elements and components included in the mobile terminal 100. The power supply unit 190 may comprise a battery, and the battery may be designed to be integrated into the terminal body or to be detachable from the terminal body. At least some of the components may operate cooperatively to implement an operation, a command or a control method of a mobile terminal in accordance with the various embodiments described herein. In addition, the operation, command or control method of the mobile terminal can be implemented on the mobile terminal by activation of at least one application program stored in the memory 170. Hereinafter, the aforementioned components will be described in more detail with reference to Fig. 1A, before describing the various embodiments implemented through the mobile terminal 100. First, with respect to the wireless communication unit 110, the broadcast receiving module 111 is generally designed to receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity via a broadcast channel. The broadcast channel may comprise a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to allow switching between broadcast channels. The mobile communication module 112 may transmit and/or receive radio signals from or to one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server and the like. These network entities are part of a mobile communication network built in accordance with technical standards or communication methods for mobile communications (for example, Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA2000 (Code Division Multiple Access 2000), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A) and the like). Examples of radio signals include audio call signals, video (telephony) call signals, or various data formats supporting the communication of text (SMS) and multimedia messages. The wireless Internet module 113 is designed to facilitate wireless Internet access. This module can be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 can transmit and/or receive radio signals via communication networks using wireless Internet technologies. Examples of such wireless Internet access include Wireless LAN (WLAN), Wi-Fi, Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide
Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A) and the like. The wireless Internet module 113 may transmit or receive data through one or more of these wireless Internet technologies, as well as other Internet technologies. In some embodiments, when wireless Internet access is implemented according to, for example, the WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE or LTE-A protocols and the like, as part of a mobile communication network, it is the wireless Internet module 113 that provides this wireless Internet access. The short-range communication module 114 is designed to facilitate short-range communications. Technologies suitable for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Wireless USB and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network in which another mobile terminal 100 (or an external server) is located, via wireless area networks. One example of such wireless area networks is wireless personal area networks. Here, another mobile terminal (which may be configured similarly to the mobile terminal 100) can be a wearable device, for example a smart watch, smart glasses or a virtual reality headset, which is capable of exchanging data with the mobile terminal 100 (or of otherwise cooperating with the mobile terminal 100). The short-range communication module 114 can detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that has been authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, transmit at least a portion of the data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114. Thus, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Similarly, when a message is received in the mobile terminal 100, the user can view the received message using the wearable device. The location information module 115 is generally designed to detect, calculate, obtain, or otherwise identify a position (or current position) of the mobile terminal. For example, the location information module 115 includes a global positioning system (GPS) module, a Wi-Fi module, or both. For example, when the mobile terminal uses a GPS module, the position of the mobile terminal can be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, the position of the mobile terminal can be acquired from information relating to a wireless access point that transmits a radio signal to, or receives one from, the Wi-Fi module. Where appropriate, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal.
The location information module 115 is a module used to acquire the position (or the current position), and it is not necessarily limited to a module for directly calculating or acquiring the position of the mobile terminal. The input unit 120 may be designed to allow different types of input to the mobile terminal 100. Examples of such inputs include audio, image, video, data and user inputs. Image and video input is often obtained by means of one or more cameras 121. These cameras 121 can process image frames of still pictures or video obtained by image sensors in a video or photo capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Meanwhile, the cameras 121 may be arranged in a matrix configuration to allow a plurality of images having different angles or focal points to be input to the mobile terminal 100. The cameras 121 may also be placed in a stereoscopic arrangement to acquire left and right images for creating a stereoscopic image. The microphone 122 transforms an external audio signal into electrical audio (sound) data. The transformed audio data may be processed in a variety of ways depending on the function performed in the mobile terminal 100. Where appropriate, the microphone 122 may include various noise suppression algorithms to eliminate unwanted noise produced during the reception of the external audio signal. The user input unit 123 is a component that allows a user to make an input. This user input may enable the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of the following: a mechanical input element (for example, a mechanical key, a button located on a front and/or rear surface or on a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch and the like) or a touch input element, among others. As an example, the touch input element may be a virtual key or a visual key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location other than the touch screen. Furthermore, the virtual key or the visual key may be displayed on the touch screen in various forms, for example a graphic, text, an icon, video, or a combination of these forms. The detection unit 140 is generally designed to detect one or more of the following: internal information of the mobile terminal, information on the ambient environment of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal, based on the detection signal. The detection unit 140 may be implemented by any of a variety of sensors, some of which will now be described in more detail. The proximity sensor 141 is a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays or the like, without mechanical contact. The proximity sensor 141 may be arranged in an internal region of the mobile terminal covered by the touch screen, or near the touch screen.
The proximity sensor 141 may include, for example, any of the following sensors: a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen implemented is of the capacitance type, the proximity sensor 141 can detect the proximity of a pointer relative to the touch screen from the variations of an electromagnetic field that responds to the approach of an object having conductivity. In this case, the touch screen (touch sensor) can also be classified as a proximity sensor. In this document, the term "proximity touch" will often be used to describe the scenario in which a pointer is positioned so as to be near the touch screen without coming into contact with it. The term "contact touch" will often be used in this document to describe the scenario in which a pointer comes into physical contact with the touch screen. The position corresponding to a proximity touch of the pointer relative to the touch screen corresponds to the position at which the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch and proximity touch patterns (for example, distance, direction, speed, time, position, movement status and the like). In general, the controller 180 processes the data corresponding to the proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be displayed on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to perform different operations or to process different data (or information) depending on whether a touch applied to a point of the touch screen is a proximity touch or a contact touch. A touch sensor may detect a touch (or touch input) applied to the touch screen, for example to the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type and a magnetic field type, among others. By way of example, the touch sensor may be designed to convert variations of pressure applied to a specific part of the display unit 151, or variations of capacitance appearing at a specific part of the display unit 151, into electrical input signals. The touch sensor may also be designed to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects are a finger, a touch pen, a stylus, a pointer, or the like. When a touch input is detected by a touch sensor, corresponding signals can be transmitted to a touch controller. The touch controller can process the received signals and then transmit the corresponding data to the controller 180. Accordingly, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component independent of the controller 180, the controller 180 itself, or combinations thereof. Meanwhile, the controller 180 can execute the same commands or different commands depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen; a brief illustrative sketch of handling proximity and contact touches separately is given below.
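As a brief, purely illustrative sketch of the proximity-touch / contact-touch distinction described above, the following Kotlin fragment routes hover (proximity) events and contact events to separate handlers using Android's MotionEvent API. The view subclass and handler names are assumptions made for illustration; this is not the patent's implementation.

```kotlin
import android.content.Context
import android.view.MotionEvent
import android.view.View

// Hypothetical view that handles proximity ("hover") touches and contact
// touches separately, mirroring the distinction drawn in the text above.
class TouchAwareView(context: Context) : View(context) {

    // Hover events arrive without physical contact (e.g., a pointer held
    // just above a capacitive screen).
    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_HOVER_ENTER,
            MotionEvent.ACTION_HOVER_MOVE -> {
                onProximityTouch(event.x, event.y)
                return true
            }
        }
        return super.onGenericMotionEvent(event)
    }

    // Contact events carry physical touch coordinates and pressure.
    override fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.actionMasked == MotionEvent.ACTION_DOWN) {
            onContactTouch(event.x, event.y, event.pressure)
            return true
        }
        return super.onTouchEvent(event)
    }

    private fun onProximityTouch(x: Float, y: Float) {
        // e.g., show visual feedback without executing a command
    }

    private fun onContactTouch(x: Float, y: Float, pressure: Float) {
        // e.g., execute the command associated with the touched point
    }
}
```

In this sketch the same coordinates can trigger different operations depending on whether they arrive as a proximity touch or a contact touch, which is the behaviour attributed to the controller 180 above.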
Whether the same commands or different commands are executed depending on the object providing a touch input may be decided, for example, based on the current operating state of the mobile terminal 100 or a currently executed application program. The touch sensor and the proximity sensor may be implemented individually, or in combination, to detect various types of touch. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch and the like. Where appropriate, an ultrasonic sensor may be implemented to recognize position information relating to a touch object by means of ultrasonic waves. The controller 180 may, for example, calculate the position of a wave generation source based on information detected by a light sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time it takes for the light to reach the optical sensor is much shorter than the time it takes for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated from this fact. In particular, the position of the wave generation source can be calculated using the time difference with respect to the time at which the ultrasonic wave reaches the sensor, the light serving as a reference signal (a worked sketch of this calculation is given at the end of this passage). The camera 121, which has been shown as a component of the input unit 120, generally comprises at least one of a camera sensor (CCD, CMOS, etc.), a photoelectric sensor (or image sensors) and a laser sensor. Implementing the camera 121 with a laser sensor can allow the detection of a touch of a physical object with respect to a stereoscopic 3D image. The photoelectric sensor can be superimposed on the display device or cover it. The photoelectric sensor can be designed to scan the movement of a physical object near the touch screen. In more detail, the photoelectric sensor may include photodiodes and transistors in rows and columns to scan content received at the photoelectric sensor by means of an electrical signal that varies according to the amount of light applied. In other words, the photoelectric sensor can calculate the coordinates of the physical object according to the variations of light, in order to obtain position information of the physical object. The display unit 151 is generally designed to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executed on the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In addition, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (glasses scheme), an auto-stereoscopic scheme (glasses-free scheme), a projection scheme (holographic scheme) or the like. The audio output module 152 is generally designed to output audio data. This audio data can come from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output in modes such as a signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode and the like.
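Returning to the ultrasonic positioning described above: since the light pulse arrives essentially instantaneously, it can serve as the emission-time reference, and each ultrasonic sensor's distance to the wave source is the speed of sound multiplied by the difference between the two arrival times. The following Kotlin sketch works this out in two dimensions with three sensors; the constants, sensor layout and function names are illustrative assumptions, not values from the patent.

```kotlin
// Approximate speed of sound in air at room temperature (m/s); assumed constant.
const val SPEED_OF_SOUND = 343.0

data class Sensor(val x: Double, val y: Double)

// Distance from the wave source to one ultrasonic sensor, using the optical
// sensor's detection time as the emission reference (light is treated as
// arriving instantaneously compared with sound).
fun distanceToSensor(lightArrivalSec: Double, ultrasoundArrivalSec: Double): Double =
    SPEED_OF_SOUND * (ultrasoundArrivalSec - lightArrivalSec)

// 2-D trilateration from three sensors: subtracting the first circle equation
// (x - xi)^2 + (y - yi)^2 = di^2 from the other two yields a linear system.
fun locateSource(s: List<Sensor>, d: List<Double>): Pair<Double, Double> {
    require(s.size == 3 && d.size == 3)
    val (a, b, c) = s
    val a1 = 2 * (b.x - a.x); val b1 = 2 * (b.y - a.y)
    val c1 = d[0] * d[0] - d[1] * d[1] - a.x * a.x + b.x * b.x - a.y * a.y + b.y * b.y
    val a2 = 2 * (c.x - a.x); val b2 = 2 * (c.y - a.y)
    val c2 = d[0] * d[0] - d[2] * d[2] - a.x * a.x + c.x * c.x - a.y * a.y + c.y * c.y
    val det = a1 * b2 - a2 * b1  // non-zero when the sensors are not collinear
    return Pair((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
}
```

For instance, a 3 ms gap between the light and ultrasound arrivals corresponds to roughly 343 * 0.003, i.e. about 1.03 m from that sensor.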
The audio output module 152 may provide an audible output relating to a particular function (for example, a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer or the like. A haptic module 153 may be designed to produce various tactile effects that a user feels, perceives or otherwise experiences. A typical example of a tactile effect produced by the haptic module 153 is vibration. The strength, pattern and other characteristics of the vibration produced by the haptic module 153 can be controlled through user selection or a setting by the controller. For example, the haptic module 153 may output different vibrations in a combined or sequential manner. In addition to vibration, the haptic module 153 may produce various other tactile effects, including a stimulation effect such as a pin arrangement moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect that reproduces the sensation of cold or warmth by using an element that can absorb or produce heat, and the like. The haptic module 153 can also be implemented so as to allow the user to feel a tactile effect through a muscular sensation, for example at the user's fingers or arm, as well as to transfer the tactile effect by direct contact. Two or more haptic modules 153 may be provided, depending on the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate the generation of an event using light from a light source. Examples of events generated in the mobile terminal 100 include message reception, call signal reception, a missed call, an alarm, a schedule notification, email reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented in such a way that the mobile terminal emits monochromatic light or light having a plurality of colors. The signal output may be terminated as soon as the mobile terminal detects that a user has checked the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted by an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may comprise wired or wireless headphone ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device having the identification module (also referred to as an "identification device" in this document) may take the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 via the interface unit 160.
When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or can serve as a passage to allow various command signals input by the user via the cradle to be transferred to the mobile terminal through it. The various command signals or the power input from the cradle may serve as signals for recognizing that the mobile terminal is properly mounted on the cradle. The memory 170 may store programs to support operations of the controller 180 and store input/output data (for example, a phone book, messages, still images, videos, etc.). The memory 170 can store data relating to the various types of vibration and audio that are output in response to touch inputs on the touch screen. The memory 170 may comprise one or more types of storage media, in particular a flash memory, a hard disk, a solid-state disk, a silicon disk, a micro-type multimedia card, a card-type memory (for example an SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk and the like. The mobile terminal 100 may also be operated in connection with a network storage device that performs the storage function of the memory 170 over a network, for example the Internet. The controller 180 may generally control the operations relating to application programs and the overall operation of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to applications when a state of the mobile terminal meets a preset condition. The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input made on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of these components in order to implement the various exemplary embodiments described herein. The power supply unit 190 receives external power or provides internal power and supplies the appropriate power required for operating the respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is generally rechargeable or may be detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be designed as one example of the interface unit 160, to which an external charger for supplying power to recharge the battery is electrically connected. As another example, the power supply unit 190 may be designed to recharge the battery wirelessly, without using the connection port. In this example, the power supply unit 190 can receive power, transferred by an external wireless power transmitter, using at least one of an inductive coupling method based on magnetic induction, or a magnetic resonance coupling method based on electromagnetic resonance. Various embodiments described herein may be implemented on a computer-readable medium, a machine-readable medium or a similar medium using, for example, software, hardware or any combination thereof. Referring now to Figs. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body.
However, the mobile terminal 100 may also be implemented in any of a number of different configurations. Examples of such configurations include watch type, clip type, glasses type, or folder type, flip type, slide type, swing type and swivel type, in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The explanations provided in this document will often relate to a particular type of mobile terminal (for example, bar type, watch type, glasses type, and the like). However, such explanations relating to a particular type of mobile terminal generally also apply to other types of mobile terminals. Here, considering the mobile terminal 100 as at least one assembly, the terminal body may be understood as a notion referring to that assembly. The mobile terminal 100 generally comprises a case (for example a frame, a housing, a cover and the like) which forms the appearance of the terminal. In this embodiment, the case is formed using a front case 101 and a rear case 102. Various electronic components are incorporated in a space formed between the front case 101 and the rear case 102. At least one intermediate case may furthermore be placed between the front case 101 and the rear case 102. The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front case 101 to form the front surface of the terminal body together with the front case 101. In some embodiments, electronic components may also be mounted on the rear case 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is detached from the rear case 102, the electronic components mounted on the rear case 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear case 102, a side surface of the rear case 102 is partially exposed. In some cases, upon the coupling, the rear case 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b. The cases 101, 102, 103 may be formed by injection-molding a synthetic resin or may be formed of a metal, for example stainless steel, aluminum (Al), titanium (Ti) or the like. As an alternative to the example in which the plurality of cases form an internal space for housing components, the mobile terminal 100 may be designed such that a single case forms this internal space. In this example, a mobile terminal 100 having a unibody is formed in such a manner that the synthetic resin or metal extends from a side surface to a rear surface. If necessary, the mobile terminal 100 may comprise a waterproofing unit (not shown) for preventing water from entering the terminal body. For example, the waterproofing unit may comprise a waterproofing member located between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, so as to hermetically seal an internal space when these cases are coupled.
The mobile terminal 100 may comprise the display unit 151, first and second audio output modules 152a and 152b, the proximity sensor 141, an illumination sensor 142, the optical output module 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, the microphone 122, the interface unit 160, and the like. Hereinafter, as shown in Figs. 1B and 1C, a representative mobile terminal 100 will be described in which the front surface of the terminal body is shown having the display unit 151, the first audio output module 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first camera 121a and the first manipulation unit 123a; the side surface of the terminal body is shown having the second manipulation unit 123b, the microphone 122 and the interface unit 160; and the rear surface of the terminal body is shown having the second audio output module 152b and the second camera 121b. Figures 1B and 1C depict certain components as being arranged on the mobile terminal. However, it is to be understood that other arrangements are possible within the scope of the teachings of the present invention. Some components may be omitted or rearranged. For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body. The display unit 151 outputs information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executed on the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. The display unit 151 may be implemented by means of one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display, and combinations of these devices. The display unit 151 may be implemented by means of two display devices that can implement identical or different display technologies. For example, a plurality of display units 151 may be arranged on one side, either separated from one another or integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor which detects a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be designed to detect that touch, and the controller 180 may, for example, generate a control command or another signal corresponding to the touch. Content input by touch may be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be designed in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display device or within the display device. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see Figure 1A).
Thus, the touch screen can replace at least some of the functions of the first manipulation unit 123a. The first audio output module 152a may be implemented as a receiver for transferring call sounds to a user's ear, and the second audio output module 152b may be implemented in the form of a loudspeaker for outputting alarm sounds, multimedia audio reproduction and the like. The window 151a of the display unit 151 generally includes an opening for allowing the audio signals produced by the first audio output module 152a to pass. One alternative is to allow the audio signals to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front case 101). In this case, a hole independently formed to output sounds may be invisible or otherwise hidden from view, thereby further simplifying the appearance and manufacture of the mobile terminal 100. The optical output module 154 may be designed to emit light indicating the generation of an event. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notification, email reception, information reception through an application and the like. Once the user has checked the generated event, the controller may control the optical output unit 154 to stop the light emission. The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123, which can be manipulated by a user to provide an input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touching, pressing, scrolling or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as a proximity touch, a hovering touch, or the like. Fig. 1B shows the first manipulation unit 123a as a touch key, but other options are possible, including a mechanical key, a push key, a touch key and combinations thereof. An input received at the first and second manipulation units 123a and 123b can be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide an input to a menu, a home key, a cancel, a search, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control the sound volume output by the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. This rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scrolling, control of the sound volume output by the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be designed to allow a touch input, a press input, or combinations thereof.
The rear input unit may be positioned to overlap the display unit 151 of the front side in a thickness direction of the terminal body. As one example, the rear input unit may be located on an upper end portion of the rear side of the terminal body so that a user can easily manipulate it with an index finger when the user grabs the terminal body with one hand. Alternatively, the rear input unit can be positioned at almost any location on the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first manipulation unit 123a in the rear input unit. Thus, in cases where the first manipulation unit 123a is omitted from the front side because the touch screen or the rear input unit replaces at least some of the functions of the first manipulation unit 123a provided on the front of the terminal body, the display unit 151 may have a larger screen. As a further alternative, the mobile terminal 100 may comprise a fingerprint sensor which scans a user's fingerprint. The controller 180 can then use the fingerprint information detected by the fingerprint sensor as part of an authentication procedure. The fingerprint sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If necessary, several microphones can be implemented, such an arrangement allowing the reception of stereo sounds. The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may comprise one or more of the following: a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for near-field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information. The second camera 121b is shown located on the rear side of the terminal body and has an image capture direction substantially opposite to the image capture direction of the first camera unit 121a. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various ways using the plurality of lenses, and images of better quality are obtained. A flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. The second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communications. At least one antenna for wireless communications may be located on the terminal body. The antenna may be installed in the terminal body or formed by the case.
For example, an antenna that forms part of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed by means of a film attached to an inner surface of the rear cover 103, or by a case that includes a conductive material. A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body. The battery 191 can receive power via a power cable connected to the interface unit 160. In addition, the battery 191 can be recharged wirelessly by means of a wireless charger. Wireless charging can be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear case 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impacts or foreign matter. When the battery 191 is detachable from the terminal body, the rear cover 103 can be detachably coupled to the rear case 102. An accessory for protecting the appearance of the mobile terminal 100, or for facilitating or extending its functions, can also be provided on the mobile terminal 100. As one example of such an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch can cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of such an accessory is a touch pen for facilitating or extending a touch input to a touch screen. Fig. 2A is a flow chart showing a method of controlling a mobile terminal according to an exemplary embodiment of the present invention, and Figs. 2B to 2E are conceptual views showing the control method of Fig. 2A. The display unit 151 displays screen information 510 corresponding to a specific function (S210). There is no limitation on the type of the screen information 510. For example, the screen information may be, without limitation, the execution screen of a specific application, the setting screen of a function, a home screen page including icons corresponding to applications, a web page and the like. Figure 2B shows a settings screen for changing settings of components of the mobile terminal. The controller 180 detects changes of the screen information, or error information in the screen information, while the specific function is controlled (S220). Here, the error information may include a situation in which a user has difficulty in using the specific function, in particular the case where the user fails to control the specific function in the desired manner, as well as a control failure and an abnormal (or erroneous) response to an input control command. As illustrated in Fig. 2B, when an error occurs with respect to the user's control command for controlling the specific function, the controller 180 controls the display unit 151 to display a notification window 501 for notifying the occurrence of the error. The notification window 501 may include information relating to the specific function, error information, an error code number, an error identification number, and the like. The notification window 501 may be displayed in one region of the screen information 510. Referring to Fig. 2C, when the screen information 510 corresponding to the specific function is repeatedly displayed for a predefined duration, the controller 180 may determine that the error has occurred from the specific function.
For example, when the user fails to execute the specific function, the user performs an operation of terminating the specific function and executing it again. In this case, the display unit 151 repetitively displays substantially the same screen information. When the same screen information is repetitively displayed a predefined number of times within a predefined period of time, the controller 180 determines the occurrence of the error. When the occurrence of the error has been determined through the notification window 501, the controller 180 controls the display unit 151 to display a first icon 601. The first icon 601 may be displayed in one region of the display unit 151, in a region adjacent to the notification window 501. Returning to Fig. 2B, the controller 180 generates a capture image 510' obtained by capturing the screen information 510 including the notification window 501, in response to a touch applied to the first icon 601. The display unit 151 may display an image indicating the generation of the capture image 510'. The controller 180 extracts text (letters, a sentence, etc.) from the capture image 510'. For example, the controller 180 can extract "error: 1499", "OK", "inquiry" and the like, which are included in the capture image 510'. When it is determined that the error has occurred, the controller 180 executes a specific application for resolving the error after extracting the text. The specific application can be executed based on a control command applied by the user. The specific application searches for an error of the mobile terminal and provides a search result for the error. The controller 180 provides the search result for the error by receiving it from a specific server or by extracting it from the memory 170 where it is stored. The drawings show examples in which the application is executed in response to a touch applied to the first icon 601 after the first icon 601 has been displayed, but the present invention is not necessarily limited thereto. For example, when the notification window 501 is repeatedly displayed, the specific application can be executed automatically without a touch being applied to the first icon 601. The controller 180 displays a guidance image for searching for the error data when it is determined that the error associated with the specific function has occurred (S230). Once the text has been extracted, the controller 180 can input at least part of the extracted text as a search term. The controller 180 can selectively define as a search term one or more extracted words relating to the error, or selectively define as a search term only the text contained in the notification window 501. The controller 180 may control the memory 170 to store (or save) the capture image 510' while it executes the application. The controller 180 searches for the error data 610 using the search term and displays the retrieved error data 610 on the display unit 151. The display unit 151 can display the error data 610 sorted by category. For example, the error data 610 may include a first search result 611 giving an explanation of the error found from the search term, and a second search result 612 including a description (or an instruction manual) for resolving the error and executing the specific function. According to the present invention, when a repetitive error occurs, an application for resolving the error is executed without the user even having to run an error-solving program or the like; a sketch of this detection-and-search flow is given below.
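The following Kotlin sketch condenses the flow just described: treat a screen repeated a predefined number of times within a predefined window as an error (step S220), then capture the screen, extract its text and use it as the search term (step S230). The thresholds, the TextRecognizer abstraction and the search callback are assumptions made for illustration, not interfaces or values taken from the patent.

```kotlin
// Assumed stand-ins for the patent's "predefined number of times" and
// "predefined period of time".
const val REPEAT_THRESHOLD = 3
const val WINDOW_MILLIS = 60_000L

// Hypothetical OCR abstraction; any text-recognition engine could sit behind it.
interface TextRecognizer {
    fun recognize(image: ByteArray): List<String>
}

class ErrorDetector(
    private val recognizer: TextRecognizer,
    private val search: (String) -> List<String>  // server or local lookup
) {
    private val recentScreens = ArrayDeque<Pair<String, Long>>()

    // Called each time screen information is (re)displayed; returns true when
    // the same screen has repeated often enough to count as an error.
    fun onScreenDisplayed(screenFingerprint: String, nowMillis: Long): Boolean {
        recentScreens.addLast(screenFingerprint to nowMillis)
        recentScreens.removeAll { (_, t) -> nowMillis - t > WINDOW_MILLIS }
        return recentScreens.count { (f, _) -> f == screenFingerprint } >= REPEAT_THRESHOLD
    }

    // Capture image -> extracted text -> search term -> error data, with no
    // manual typing by the user.
    fun resolve(captureImage: ByteArray): List<String> {
        val extracted = recognizer.recognize(captureImage)  // e.g. "error: 1499"
        val term = extracted
            .filter { it.contains("error", ignoreCase = true) }
            .ifEmpty { extracted }
            .joinToString(" ")
        return search(term)
    }
}
```

The selective filtering above corresponds to defining as the search term only the text relating to the error, with the full extracted text as a fallback.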
Returning to Fig. 2C, when the repetitive screen information is displayed, the controller 180 commands the display unit 151 to display a second icon 602. Although not shown in detail, this second icon 602 may disappear from the display unit 151 when no touch has been applied for a specific duration or when the repetitive output state ends. In that case, the controller 180 may determine that the repetition is an intentional command of the user, and not the occurrence of an error.

The shape of the second icon 602 is not necessarily limited to the one shown. The second icon 602 is preferably displayed in a form that does not overlap the screen information 510.

The controller 180 executes the specific application in response to a touch applied to the second icon 602. The controller 180 may define text included in the repeatedly displayed screen information 510 as a search term, or define text extracted from the capture image 510' of the screen information 510 as a search term. In addition, the controller 180 may generate the error data 610 with additional information relating to the application associated with the screen information 510.

According to this embodiment, when the user repeatedly displays the same screen information, this action is treated as the occurrence of an error, and a method of quickly executing an application to resolve the error is provided. Furthermore, when it is determined that the repetitive screen information is displayed intentionally by the user, the display of the icon may be limited to improve the user's comfort in controlling the screen information.

Referring to Fig. 2D, the controller 180 may execute the specific application based on a voice signal applied through the microphone 122. The controller 180 may activate the microphone 122 at a predefined time interval and control the activated microphone 122 to receive the user's voice. Alternatively, when specific screen information is displayed continuously or repeatedly, the controller 180 may activate the microphone 122.

The controller 180 analyzes the voice received through the microphone 122. When the received voice corresponds to a predefined voice signal, the controller 180 determines that an error has occurred during the execution of the specific function and then executes the application. For example, the voice signal may be a voice uttering a word (or phrase) about an error, a sigh, and the like. In addition, a voice signal that the user frequently emits (or speaks) when an error occurs may be stored in the memory 170. When a voice signal similar to the stored voice signal is input, the controller 180 may determine that the error has occurred.

When the occurrence of the error is detected from the received voice, the controller 180 may control the display unit 151 to display the second icon 602. The second icon 602 has already been described with reference to Fig. 2C, so its description will not be repeated. The display unit 151 displays the error data 610 in response to a touch applied to the second icon 602.

According to this embodiment, when an error occurs, the occurrence of the error can be determined from a voice that the user utters upon noticing it. This can provide a method that resolves the error even when the user does not take steps to resolve it himself.
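A minimal sketch of this voice test follows, assuming the speech front end has already produced recognized text. The phrase list, the normalization, and the containment test are illustrative assumptions standing in for whatever matching the terminal actually performs.

```kotlin
// Hedged sketch: match a recognized utterance against stored "error voices"
// (error words, sighs, phrases the user frequently emits when an error occurs).
object VoiceErrorDetector {
    // Assumed, pre-normalized examples; real entries would come from memory 170.
    private val storedErrorSignals = mutableSetOf("error again", "this is not working", "ugh")

    fun registerUserErrorPhrase(phrase: String) {
        storedErrorSignals.add(normalize(phrase))
    }

    /** Returns true when the recognized text resembles a stored error signal. */
    fun indicatesError(recognizedText: String): Boolean {
        val text = normalize(recognizedText)
        return storedErrorSignals.any { text.contains(it) }
    }

    private fun normalize(s: String) = s.lowercase().trim().replace(Regex("[^a-z ]"), "")
}

fun main() {
    println(VoiceErrorDetector.indicatesError("Ugh, error again!"))  // true: show icon 602
    println(VoiceErrorDetector.indicatesError("play some music"))    // false
}
```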
Referring to Fig. 2E, once the capture image 510' has been stored in the memory 170, a specific application may be executed by the user. An execution screen 620 may include an image icon 622 for selecting stored images as the search target. The image icon 622 may be displayed next to an input window 621 allowing the user to enter a textual search term.

The controller 180 instructs the display unit 151 to display a plurality of images stored in the memory 170, in response to a touch applied to the image icon 622. The controller 180 may selectively extract from the memory 170 at least a portion of a capture image of a screen of the display unit 151, an image stored when an error occurred, and an image stored together with error information, and display the extracted portion of these images on the display unit 151. Alternatively, the controller 180 may execute another application (e.g. a gallery application) that stores the images.

The controller 180 may extract text from the image selected among the plurality of images and enter the extracted text into the input window 621.

According to this embodiment, the user can capture and store an image displayed on the display unit when an error occurs, and then choose the stored image after executing the specific application. The user can thus store the information related to the error and resolve the error through the specific application after a lapse of time.

The foregoing description relates to a control method for providing an application execution icon for resolving an error when it is determined that the error has occurred. Hereinafter, a control method for providing guidance information for resolving an error while providing screen information will be described.

Figs. 3A to 3D are conceptual views showing a control method for providing guidance information upon occurrence of an error according to various exemplary embodiments of the present invention.

As shown in Fig. 3A, while screen information 520 for a specific function is displayed on the display unit 151, the controller 180 may determine that an error has occurred. For example, the controller 180 may determine that the error has occurred when the screen information 520 is displayed repeatedly, when the screen information 520 is displayed continuously for a predefined time, or when a notification window including an error message is displayed. The controller 180 may execute the specific application immediately when the error occurs. The controller 180 may activate the microphone 122 to receive the user's voice in response to the execution of the specific application.

When the application is executed, the display unit 151 displays a guide image 603 in a region of the screen information 520. The guide image 603 may include text allowing the user to confirm whether or not an error has occurred in the specific function. In addition, the controller 180 may control the audio output module 152 to output a voice signal for verifying the occurrence or non-occurrence of the error. The voice signal may be a voice into which that text has been converted.

While the guide image 603 is displayed on the display unit 151 in response to the execution of the specific application, the display unit 151 can receive a user touch,
allowing the user to continuously control the specific function by touch while the guide image 603 is displayed.

The guide image 603 may be displayed at an upper end of the screen information 520, but is not necessarily limited thereto. Indeed, the guide image 603 may be displayed in any region of the screen information 520, but is preferably displayed semi-transparently so that the screen information 520 remains visible through the guide image 603. The guide image 603 may change size depending on the guidance information it provides.

The microphone 122 receives the user's voice during the execution of the specific application. For example, the controller 180 may activate the microphone 122 in response to a touch applied to the guide image 603, and recognize the user's voice through a natural language recognition function. The display unit 151 may display, in the guide image 603, the text obtained by converting the voice received through the microphone 122 by natural language recognition.

The controller 180 searches for error data for resolving the error that has occurred, using the text corresponding to the user's voice, and delivers at least a portion of the error data. The audio output module 152 may output the at least a portion of the error data as a voice signal, and the display unit 151 may output the at least a portion of the error data in the form of textual information. Although not shown in detail, the controller 180 may terminate the application for resolving the error in response to a preset touch applied to the guide image 603.

Referring to Fig. 3B, the controller 180 may provide sequential guidance information according to the control steps of the specific function. The controller 180 may control the specific function being executed, in response to a touch applied to the display unit 151 while the guide image 603 is displayed in a region of the display unit 151. Therefore, the user can apply a control command on the basis of the guide image 603 displayed on the display unit 151 or a guidance voice output through the audio output module 152. The controller 180 controls the specific function based on the control command and provides additional guidance information on the guide image 603 based on the input user voice and the state of the control command. In addition, the audio output module 152 may output a voice signal for the additional guidance information.

Although not shown in detail, when it is determined that a function activated as a result of a voice signal initially input by the user has been executed, the controller 180 may terminate the specific application. As a result, the guide image 603 displayed on the display unit 151 may disappear. In addition, once the guide image 603 has disappeared, the display unit 151 may further display an icon that receives a touch for terminating the activated application. In this case, the user can terminate the output of the guidance information before the specific function entered through the voice is completed.

According to this embodiment, the user can receive guidance information that facilitates a control operation while viewing the screen information currently under control. In addition, the user can perform functions step by step while continuously viewing the guidance information. This allows the user to immediately perform a desired function without having to write down or memorize the guidance information. Moreover, error information can be delivered through a voice, which allows an error to be resolved without even entering text or terms in the screen information.
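The step-by-step delivery described for Fig. 3B can be modeled as a small state holder, sketched below. The step texts and the completion callback are invented for illustration; they are not the patent's actual guidance content.

```kotlin
// Minimal sketch of sequential guidance per control step: the guide image advances
// to the next instruction each time the corresponding control command completes.
class GuidanceSession(private val steps: List<String>) {
    private var index = 0

    val current: String?
        get() = steps.getOrNull(index)   // text currently shown in guide image 603

    /** Call when the user's control command for the current step has completed. */
    fun onStepCompleted(): String? {
        index++
        return current   // null means guidance is finished and image 603 may disappear
    }
}

fun main() {
    val session = GuidanceSession(
        listOf("Open the Wi-Fi settings", "Toggle Wi-Fi off and on", "Reselect the network")
    )
    println(session.current)             // first guidance text
    println(session.onStepCompleted())   // next guidance text
    println(session.onStepCompleted())   // last guidance text
    println(session.onStepCompleted())   // null: terminate the application
}
```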
Referring to Fig. 3C, when a voice recognition application for performing a desired function by voice input is executed, the controller 180 may control the display unit 151 to display previously provided guidance information 604. When the voice recognition application has been executed as a result of the occurrence of an error, the controller 180 may control the memory 170 to store the delivered guidance information. The controller 180 may search for the guidance information stored in the memory 170 based on the specific function being executed while the voice recognition application is running. The guidance information may include information relating to a control method for resolving an error associated with the specific function.

The controller 180 instructs the display unit 151 to display notification information 604' relating to the retrieved guidance information 604. This notification information 604' may include a storage date of the guidance information, a description (keyword) of the function associated with the guidance information, and a graphic image for receiving a touch to display the guidance information.

The controller 180 may provide information relating to the sequential control steps for executing the specific function, or complete data in which the voice signal input by the user and the created guidance information are arranged sequentially, based on a choice made by the user. The complete data may be displayed as textual information on the display unit 151. The controller 180 may control the display unit 151 and the audio output module 152 to sequentially provide the control steps for controlling the specific function associated with the guidance information.

According to this embodiment, when a voice recognition application is executed by a user during the execution of a specific function, previously provided error data can be supplied again so as to quickly resolve an error of the specific function.

Referring to Fig. 3D, the controller 180 commands the display unit 151 to display a notification window 502 to signal an error that has occurred during the execution of the specific function. The controller 180 determines that the error has occurred when the notification window 502 is displayed, and then executes a specific application.

In response to the execution of the specific application, the controller 180 instructs the display unit 151 to display an execution screen 620 in a region of the display unit 151. For example, the display unit 151 may be divided so that the screen information 510 and the execution screen 620 are displayed together, or the execution screen 620 may be displayed semi-transparently. The controller 180 may adjust the size of the execution screen 620 or terminate the application, in response to a touch applied to an edge of the execution screen 620. The execution screen 620 may include at least one graphic image for receiving a touch to provide error data, and an input window 621 allowing the user to enter information relating to the error.

The controller 180 extracts terms associated with the error from the notification window 502, in response to a continuous touch that is initially applied to the notification window 502 and then released on the execution screen 620. The controller 180 instructs the display unit 151 to display the extracted terms in the input window 621 when the continuous touch (drag-and-drop) is released, and to display the execution screen 620 in its entirety when the extracted terms are entered in the input window 621.
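One plausible way to realize this continuous touch is the standard Android drag-and-drop framework, sketched below: the notification window's text is packaged as ClipData on drag start and dropped into the input window 621. The view types and wiring are assumptions for illustration.

```kotlin
import android.content.ClipData
import android.view.DragEvent
import android.view.View
import android.widget.EditText
import android.widget.TextView

// Hedged sketch of the drag-and-drop term transfer from notification window to
// input window 621, using the platform drag-and-drop APIs as a stand-in.
fun wireNotificationDrag(notificationView: TextView, inputWindow: EditText) {
    notificationView.setOnLongClickListener { v ->
        val clip = ClipData.newPlainText("error", notificationView.text)
        v.startDragAndDrop(clip, View.DragShadowBuilder(v), null, 0)
        true
    }
    inputWindow.setOnDragListener { _, event ->
        when (event.action) {
            DragEvent.ACTION_DROP -> {
                // Enter the extracted terms into the input window when released.
                inputWindow.setText(event.clipData.getItemAt(0).text)
                true
            }
            else -> true   // accept DRAG_STARTED and the other intermediate events
        }
    }
}
```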
Alternatively, when a touch is applied to the screen information 510, the controller 180 may extract terms from the screen information 510 and enter the extracted terms into the input window 621. In this case, the controller 180 may set a certain range within the screen information 510 by the user's touch input.

While the terms are displayed in the input window 621, the controller 180 may perform a function desired by the user (e.g. an error-related search, providing a user manual, searching questions and answers about the error, etc.) for the specific function. Meanwhile, when no term is extracted from the notification window 502, or when no text entry is possible in the input window 621, the controller 180 may control the display unit 151 to maintain the simultaneous display of the screen information 510 and the execution screen 620.

According to this embodiment, when an error occurs, the user can immediately access the execution screen of an application for resolving the error. In addition, the user is allowed to enter the error information he wishes to search for, based on a simple touch input.

Figs. 4A to 4C are conceptual views showing a method of providing error data using a capture image.

As shown in Fig. 4A, while a notification window indicating the error that has occurred is displayed, the controller 180 captures the screen information 510 including the notification window, based on a control command for capturing an image displayed on the display unit 151. The controller 180 controls the memory 170 to store a capture image 510' of the screen information 510.

The controller 180 commands the display unit 151 to no longer display the notification window in response to the control command. The display unit 151 then displays only the screen information 510 of the specific function, and receives a touch from the user to control the specific function. In this case, however, the display unit 151 displays a first icon 601 for receiving a touch to provide error data associated with the error that previously occurred.

When the touch is applied to the first icon 601, the controller 180 extracts text from the capture image 510'. The controller 180 then executes a specific application to resolve the error and instructs the display unit 151 to display an execution screen 620. The execution screen 620 includes an input window 621 for receiving entry of a search term. The controller 180 enters the extracted text as a search term and performs a search.

Meanwhile, the controller 180 may control the screen information based on a touch applied to the screen information 510 displayed along with the first icon 601. The controller 180 may control the display unit 151 not to display the first icon 601 when the screen information 510 changes in response to the touch.

According to this embodiment, when an error occurs, the user may first store the error that occurred as an image and try to resolve the error in another way before searching for the error data. If this other way is unknown or the error is not resolved, the user can then touch the icon to receive the error data using the stored image.
Referring to Fig. 4B, the controller 180 stores the screen information as a capture image 510' and also stores the operating log information of the mobile terminal 100 while the screen information 510 is displayed. Here, the operating log information may correspond to a server accessed by the user, information about data transmitted and received by the user through radio signals, information about stored data, and the like.

When the capture image 510' is displayed, the controller 180 commands the display unit 151 to display detailed information (e.g., storage time, size, media type, etc.) relating to the capture image 510'. In this case, however, the log information is not displayed.

When a specific signal is input, or entry into a specific place is detected by the mobile terminal, the controller 180 may activate an input window 531 allowing the user to enter password information for displaying the log information. For example, the specific place may correspond to the after-sales service center of a manufacturer where the mobile terminal is repaired, and the like, but the present invention is not necessarily limited thereto. In addition, the password information may correspond to identification information of after-sales service personnel. Alternatively, the user may enter information about a user-defined password in the input window 531. The controller 180 may control the display unit 151 to display the log information along with the detailed information, based on the password information.

With the present invention, it is possible to address the privacy concerns related to the log information created during a user's use of the mobile terminal, and to increase the chances of resolving an error, by providing the log information only to the manufacturer's after-sales service.

Referring to Fig. 4C, the controller 180 generates the capture image 510' based on a control command for capturing the screen information displayed on the display unit 151. The display unit 151 displays a plurality of icons for modifying and storing the capture image 510'. The plurality of icons may include an error icon for storing the capture image 510' together with error information. The controller 180 may store the capture image 510' together with information relating to the specific function being executed, in response to a touch applied to the error icon.

The capture image 510' is stored together with other images. For example, the display unit 151 may display thumbnail images corresponding to the images stored in the memory 170 and to the capture image 510'. The controller 180 may display an image stored in the memory 170 in its entirety on the display unit 151, in response to a touch applied to a thumbnail image of that image. However, the controller 180 executes a specific application to resolve the error, in response to a predefined touch input applied to the capture image 510', and displays an execution screen 620. Specifically, the controller 180 may execute the specific application when a given image is stored together with a specific function or specific error information.

The controller 180 may allow the user to enter the text extracted from the capture image 510', or the information relating to the specific function or error information stored together with the image, into the input window 621 of the execution screen 620.

According to this embodiment, when an error occurs, the user can store the error that occurred as a capture image, and more easily execute an application to resolve the error using the stored capture image.
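The Fig. 4B behavior, a capture stored with its operating log but with the log hidden until password information is verified, can be sketched as follows. The field names, the placeholder credential, and the verification rule are assumptions; a real implementation would verify after-sales-service credentials properly.

```kotlin
// Hedged sketch: capture image stored together with detailed info and operating
// log, where the log is only revealed after password verification (Fig. 4B).
data class StoredCapture(
    val imagePath: String,
    val detail: String,                 // storage time, size, media type, ...
    private val operatingLog: String    // server access, radio traffic, stored data, ...
) {
    fun logInfo(password: String): String? =
        if (verify(password)) operatingLog else null   // hidden unless verified

    private fun verify(password: String) = password == "service-1234"  // placeholder only
}

fun main() {
    val capture = StoredCapture(
        imagePath = "/captures/510.png",
        detail = "stored 2016-06-10, 1.2 MB, PNG",
        operatingLog = "accessed server X; sent 3 packets; wrote settings.db"
    )
    println(capture.logInfo("wrong"))         // null: only detailed info is shown
    println(capture.logInfo("service-1234"))  // log displayed with the detailed info
}
```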
Fig. 5 is a conceptual view showing a control method for executing a specific application according to another exemplary embodiment of the present invention.

When an error occurs, the display unit 151 displays a notification window 604 to signal the error that has occurred. The notification window 604 may include an icon 604' corresponding to an application for providing error data. The controller 180 executes the application in response to a touch applied to the icon 604', and instructs the display unit 151 to display the text of the notification window 604 on the execution screen 620. The controller 180 extracts the text of the notification window 604 and performs a search by entering the extracted text as a search term.

Although not shown in the drawing, when a plurality of notification windows are displayed on the display unit 151, each notification window may include such an icon. In this case, the user can obtain the error data for each piece of error information by applying a touch to the icon in question. This allows the user to individually check the error data for each notification window.

Figs. 6A to 6D are conceptual views showing a control method for providing error data according to various exemplary embodiments of the present invention.

As shown in Fig. 6A, an application execution screen 630 includes an input window 631 in which the error information is entered as a search term. Text can be entered in the input window 631. This text may correspond to text extracted from the screen information corresponding to a specific function, or from a notification window reporting error information, or may be entered by the user. Even when the text extracted by the controller 180 is displayed in the input window 631, the text can be modified by the user.

The execution screen 630 includes a first region 632 and a second region 633. The first region 632 is provided with a first search result including previously stored questions and their answers, retrieved using the search term, and a second search result including the instructions for using a specific function, also retrieved using the search term. The second region 633 includes video information. The video information may be provided by a specific server, for example the server of the mobile terminal's manufacturer. Alternatively, the controller 180 may search the Internet using the search term. Thus, videos associated with similar errors that have been uploaded by other users can be provided.

Referring to Fig. 6B, the controller 180 may provide associated search terms 634. These associated search terms may include text related to the error as well as text similar to the text entered in the input window 631. Thus, the number of errors caused when extracting text from an image can be reduced, and if the user cannot find the desired error data, a larger amount of data can be searched more quickly.

Referring to Fig. 6C, the controller 180 may generate the search term by extracting a portion of the text extracted from the image. For example, when the text is extracted sentence by sentence from the image by the controller 180, a response 635 that has a strong correspondence with the extracted sentence may be displayed at the top of the execution screen. For example, the response 635 may correspond to error data including the extracted sentence. In this case, there is no limit on the category of the response with the highest match value.
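The "strongest correspondence first" ordering of Fig. 6C can be sketched with a simple word-overlap score, shown below. This scoring is an illustrative stand-in, assuming nothing about the matching the terminal actually uses.

```kotlin
// Hedged sketch: rank stored question/answer entries so the response with the
// strongest correspondence to the extracted sentence appears at the top (response 635).
data class Answer(val title: String, val body: String)

fun rankAnswers(extractedSentence: String, candidates: List<Answer>): List<Answer> {
    val queryWords = extractedSentence.lowercase().split(Regex("\\W+")).toSet()
    return candidates.sortedByDescending { answer ->
        val answerWords = (answer.title + " " + answer.body).lowercase()
            .split(Regex("\\W+")).toSet()
        (queryWords intersect answerWords).size   // overlap = correspondence score
    }
}

fun main() {
    val ranked = rankAnswers(
        "error 1499 while saving settings",
        listOf(
            Answer("Camera focus", "How to fix blurry photos"),
            Answer("Error 1499", "Settings cannot be saved: clear the app cache")
        )
    )
    println(ranked.first().title)   // "Error 1499" is displayed at the top
}
```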
Referring to Fig. 6D, when the capture image 510' is displayed, the controller 180 searches for at least one image 640 that is identical to at least a portion of the capture image 510', and is therefore regarded as similar to the capture image 510', and displays the at least one retrieved image 640. The at least one image 640 corresponds to an image relating to the error data. For example, when a touch is applied to the at least one image 640, a user manual 643 and video information 642 may be displayed. The at least one image may correspond to an image in the user manual, a thumbnail image of a video file, and the like.

According to this embodiment, the user can search for desired error data more quickly based on image similarity, without a step of extracting text from the image.

Figs. 7A and 7B are conceptual views showing a control method for providing guidance information.

As shown in Fig. 7A, the controller 180 displays an execution screen 630 including error data retrieved using a search term based on error information previously displayed on an execution screen 620. The controller 180 instructs the display unit 151 to display a first guidance screen 651, in response to a touch selecting one item of the error data. For example, when a user manual is selected by the touch input, the guidance screen may include the user manual displayed sequentially. Meanwhile, when video error data is selected, the first guidance screen 651 may be a video screen being reproduced. The video according to this embodiment preferably corresponds to a video screen provided by the after-sales service server of the mobile terminal.

The first guidance screen 651 may include an application icon 651a that receives a touch for controlling the mobile terminal based on the error data. The controller 180 instructs the mobile terminal to perform the control steps contained in the error data, in response to a touch applied to the application icon 651a. Alternatively, the controller 180 may perform a control step desired by the user, using both the input error information and the error data. For example, when a step of choosing one of a plurality of contents is included in the control steps, the controller 180 may select the content that corresponds to the error information. Alternatively, when a step requiring user input is included, the controller 180 may stop the execution of the control steps and cause the display unit 151 to display notification information for the user. The display unit 151 may display a result screen once the control steps have been completed.

According to this embodiment, the user can quickly have a desired control step performed without having to read all the error data.

Although not detailed in the drawing, the controller 180 may control the display unit 151 to sequentially display screen information indicating the control steps. Alternatively, the controller 180 may perform the control steps while the user is running an application, and control the display unit 151 to display a completion screen after the control steps have been performed.
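A minimal sketch of the application icon 651a behavior follows: control steps from the error data are executed one by one, and execution stops with a notification when a step requires user input. The step model and the callbacks are assumptions made for illustration.

```kotlin
// Hedged sketch: execute the control steps carried in the error data, pausing
// when a step requires user input, and reporting completion at the end.
sealed interface ControlStep {
    data class Automatic(val description: String, val run: () -> Unit) : ControlStep
    data class NeedsUser(val prompt: String) : ControlStep
}

fun performControlSteps(steps: List<ControlStep>, notifyUser: (String) -> Unit) {
    for (step in steps) {
        when (step) {
            is ControlStep.Automatic -> step.run()
            is ControlStep.NeedsUser -> {
                notifyUser(step.prompt)   // stop and display notification information
                return
            }
        }
    }
    notifyUser("Done")                    // result screen once all steps are completed
}

fun main() {
    performControlSteps(
        listOf(
            ControlStep.Automatic("Open settings") { println("opening settings") },
            ControlStep.NeedsUser("Choose the content to restore"),
            ControlStep.Automatic("Restart function") { println("restarting") }
        )
    ) { msg -> println("notify: $msg") }
}
```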
Referring to Fig. 7B, the controller 180 instructs the display unit 151 to display a second guidance screen 652, in response to a touch applied to the execution screen 630 including the error data. This second guidance screen 652 includes a guidance icon 652a for displaying the error data sequentially. The controller 180 sequentially displays the error data in response to a touch applied to the guidance icon 652a.

The controller 180 divides the error data by control step and sequentially displays one portion of the divided error data at a time. While the first divided error data 652b is displayed, the display unit 151 receives a touch input. The controller 180 modifies the screen information 510 according to the touch input. The first error data 652b may be displayed semi-transparently so that the screen information 510 can be viewed through it. In addition, the controller 180 may stop displaying the error data in response to a touch applied to the first error data 652b.

When the screen information 510 changes, the controller 180 instructs the display unit 151 to display second error data 652c corresponding to the control step following that of the first error data 652b. Although not detailed in the drawing, the controller 180 may control the audio output module 152 to output the error data as a voice signal.

Figs. 8A to 8C are conceptual views showing a control method for storing error information in real time.

As shown in Fig. 8A, when it is determined that an error has occurred during the execution of a specific function, the controller 180 commands the display unit 151 to display a recording icon 605. For example, the controller 180 may determine that the error has occurred when substantially the same screen information is displayed repeatedly, when a notification window signaling the occurrence of an error is displayed, or when an expected control command has not been applied for a predefined time.

The recording icon 605 may be placed in a region of the display unit 151, and its output position may change depending on the elements appearing in the screen information. In addition to the recording icon 605, the display unit 151 may display an instruction window 605' including a description indicating that the changes of the screen information displayed on the display unit 151 are recorded over a certain period of time. The instruction window 605' may be displayed only for a predefined duration (several seconds), based on a user setting.

The controller 180 commands the memory 170 to record the changes of the screen information displayed on the display unit 151, in response to a touch applied to the recording icon 605. While the changes of the screen information are being recorded, the display unit 151 receives touch inputs, and the specific function corresponding to the screen information is controlled based on the received touch inputs. The display unit 151 may also display the time elapsed since the start of the recording, together with the recording icon 605. When a touch is again applied to the recording icon 605, the recording of the changes of the screen information is terminated. The controller 180 commands the memory 170 to store the recorded file. The memory 170 may temporarily store the recorded file and also store information relating to the specific function performed and the user's log information.

Once the recorded file has been produced, the display unit 151 displays a confirmation window 660 in which the user confirms whether the recorded file is to be transmitted to an external device. Thus, the user can immediately transmit the recorded file to the external device or share it on a server.

According to the present invention, since the process of an error's occurrence can be stored against a lapse of time, the commands executed by the user can be stored sequentially. This can make it easier to identify the origin of an error.
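The recording behavior can be sketched as below: a toggle on the recording icon 605 starts and stops the session, and each screen-information change is stored with its elapsed time so the user's commands stay in order. Real screen capture (e.g. via MediaProjection) is deliberately out of scope; the entry model is an assumption.

```kotlin
// Hedged sketch: record screen-information changes with elapsed timestamps while
// recording is active; a second tap on icon 605 ends the recording.
class ScreenChangeRecorder {
    private var startMillis: Long = -1
    private val entries = mutableListOf<Pair<Long, String>>()   // (elapsed ms, change)

    /** Tap on icon 605: returns true when recording starts, false when it stops. */
    fun toggle(nowMillis: Long): Boolean =
        if (startMillis < 0) { startMillis = nowMillis; true }
        else { startMillis = -1; false }

    fun onScreenChanged(nowMillis: Long, description: String) {
        if (startMillis >= 0) entries.add((nowMillis - startMillis) to description)
    }

    fun recordedFile(): List<Pair<Long, String>> = entries.toList()
}

fun main() {
    val recorder = ScreenChangeRecorder()
    recorder.toggle(0L)                                       // start recording
    recorder.onScreenChanged(1_200L, "opened Wi-Fi settings")
    recorder.onScreenChanged(4_500L, "error window displayed")
    recorder.toggle(5_000L)                                   // stop: confirm transmission
    println(recorder.recordedFile())
}
```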
As shown in Fig. 8B, when it is determined that the error has occurred, the controller 180 commands the display unit 151 to display a third icon 606. The controller 180 transmits a request signal for a video call to a predefined external device, in response to a touch applied to the third icon 606. Alternatively, a video call function may be executed, and a list for selecting the external device to which the request signal is to be transmitted may be displayed, in response to a touch applied to the third icon 606.

When the video call is established, the controller 180 displays a call screen 710 including information relating to the video call. The display unit 151 is divided into first and second regions, and displays a first image 711, as a captured image (or previously stored image) of the user, and a second image 712, as a captured image (or previously stored image) of a third party (the called party), on the first and second regions, respectively.

The call screen 710 includes a switch icon 713 for selecting the camera to capture the user's face from among the first and second cameras 121a and 121b. When a plurality of touches are applied to the switch icon 713, the controller 180 determines that an error has occurred and instructs the display unit 151 to display the previously displayed screen information 714 in the first region. The controller 180 controls the screen information 714 according to the touch applied to the first region, and instructs the wireless communication unit 110 to continuously transmit the changes of the screen information 714 to the external device. Thus, the third party can view in real time the screen information currently displayed in the first region of the display unit 151 of the mobile terminal, rather than an image of the user's face (or a subject captured through a camera).

Alternatively, the controller 180 may display screen information 714' in full on the display unit 151. A modified image 712' of the second image may be displayed in a region of the display unit 151. Even in this case, the screen information varies in response to the user's touch inputs, and the variations of the screen information are transmitted to the external device in real time. Meanwhile, the call screen 710 may include a command icon separate from the switch icon 713, and the screen information may be transmitted to the external device based on the command icon.

According to this embodiment, since an image in which the screen information changes can be transmitted to an external device in real time during a video call, the user can obtain information for resolving an error in real time from a third party. In addition, since an icon for making an immediate call is displayed when an error occurs, the user can easily seek help even without knowing the control procedure at all.

Referring to Fig. 8C, when password information must be entered to perform the specific function, the controller 180 displays a security input screen 721 in the first region. While the security input screen 721 is displayed on the display unit 151 and the user enters the password information on the security input screen 721, the controller 180 restricts the transmission of the security input screen 721 and commands the wireless communication unit 110 to transmit a predefined security image 721'. As a result, the security image 721' is displayed in a region of the external device 100'.

A restriction icon 721a for restricting transmission of the security input screen 721 may be displayed in the second region. The controller 180 may restrict the transmission of the security input screen 721 only when the security information is entered after a touch has been applied to the restriction icon 721a. When the security is released in response to proper entry of the security information, the controller 180 instructs the wireless communication unit 110 to transmit the screen information 722 to the external device 100'.

According to this embodiment, the user can selectively restrict the transmission of unwanted information, for example security information and information that may cause an invasion of privacy.
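The Fig. 8C substitution can be sketched very compactly: while the security input screen is active, each frame that would be sent to the external device is replaced by the predefined security image. The frame representation and the transport are placeholders; only the restriction rule follows the described behavior.

```kotlin
// Hedged sketch: replace transmitted frames with the security image 721' while
// the security input screen 721 is displayed; resume when security is released.
class ScreenShareSource(private val securityImage: ByteArray) {
    var securityScreenActive: Boolean = false   // set while screen 721 is displayed

    /** Returns the frame that may be sent to the external device 100'. */
    fun frameToTransmit(currentScreenFrame: ByteArray): ByteArray =
        if (securityScreenActive) securityImage else currentScreenFrame
}

fun main() {
    val source = ScreenShareSource(securityImage = ByteArray(4))  // stand-in image 721'
    val normalFrame = ByteArray(16)

    println(source.frameToTransmit(normalFrame).size)   // 16: screen info transmitted
    source.securityScreenActive = true                  // password entry begins
    println(source.frameToTransmit(normalFrame).size)   // 4: security image sent instead
    source.securityScreenActive = false                 // security released
    println(source.frameToTransmit(normalFrame).size)   // 16: transmission resumes
}
```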
According to this embodiment, the user can also immediately modify and store a preview image obtained in real time through a camera.

The present invention can be implemented as computer-readable codes on a program recording medium. The computer-readable medium may include any type of recording device that stores data readable by a computer system. Examples of such computer-readable media include a hard disk drive (HDD), a solid-state disk (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage element, and the like. In addition, the computer-readable medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may include the controller 180 of the terminal.

Therefore, it should be understood that the embodiments described above are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the scope defined in the appended claims. All changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are therefore intended to be embraced by the appended claims.
Claims (10)

[1] A mobile terminal comprising: a display device (151); and a controller (180) configured to: display, by the display device, screen information (510) corresponding to a specific function being executed; recognize a user-caused error or problem that occurs during the execution of the specific function, from a specific image, keyword, or term in the displayed screen information, or from a usage pattern that has been predefined as an indication of the error or problem; cause the display device to display guidance information providing a solution or help for the error or problem; and cause the display device (151) to display information related to the solution or help in response to a user input received while the guidance information is displayed.

[2] The terminal of claim 1, wherein the controller (180) is further configured to determine that the error or problem has occurred when a notification window (501) signaling the error is displayed, or when the screen information is displayed repeatedly according to a predefined pattern.

[3] The terminal of claim 2, wherein: the controller (180) is further configured to execute a specific application to search for data related to the error or problem; and the guidance information (603) is included in an execution screen of the specific application.

[4] The terminal of claim 3, wherein the controller (180) is further configured to search for data related to the error or problem by executing the specific application in response to a series of touch inputs received via at least one of the notification window or information from

[10] The terminal of claim 1, wherein the controller (180) is further configured to display, by the display device (151), the information related to the solution or help sequentially for each control step for resolving the error or problem.

[11] The terminal of claim 1, wherein the controller is further configured to perform an operation for resolving the error or problem based on the information related to the solution or help, when a touch is applied to an application icon (651a) displayed along with the information.

[12] The terminal of claim 1, further comprising a memory (170), wherein the controller is further configured to store, by the memory, the changes of the screen information over a period of time, in response to a control command applied through the display device (151) after a touch is received via the guidance information.

[13] The terminal of any one of claims 1 to 12, further comprising a wireless communication unit (110) configured to perform wireless communication with an external device, wherein the controller is further configured to: perform a video call function with the external device; and cause the wireless communication unit to transmit the screen information to the external device when the problem occurs.

[14] A method for controlling a mobile terminal, the method comprising: displaying, on a display device, screen information corresponding to a specific function being executed; recognizing a user-caused error or problem that occurs during the execution of the specific function, from a specific image, keyword, or term in the displayed screen information, or from a usage pattern that has been predefined as an indication of the error or problem; displaying guidance information providing a solution or help for the error or problem; and displaying information related to the solution or help in response to a user input received while the guidance information is displayed.
[15] The method of claim 14, further comprising: extracting text from the guidance information when the guidance information comprises a capture image of the screen information; and searching for data related to the error or problem using the extracted text as a search term.