Patent Abstract:
APPARATUS AND METHOD TO CONTROL MOVEMENT-BASED USER INTERFACE. A method and apparatus for controlling a motion-based user interface are provided, in which a touch screen displays an image and receives input of a user touch at at least one point, a sensor unit detects movement of the apparatus, a motion calculator calculates a degree of movement of the apparatus upon detection of the movement by the sensor unit, and, when at least one point is touched on the touch screen, a controller determines the number of touched points, receives information about the degree of movement of the apparatus from the motion calculator, and determines whether to change the image size or the image position according to the number of touched points.
Publication number: BR112013009161B1
Application number: R112013009161-4
Filing date: 2011-10-13
Publication date: 2020-11-24
Inventors: Hyun-Su Hong; Yung-Keun Jung; Jae-Myeon Lee; Il-hwan Kim
Applicant: Samsung Electronics Co., Ltd.
IPC main classification:
Patent Description:

TECHNICAL FIELD
The present invention relates to a user interface and, more particularly, to an apparatus and method for controlling a user interface according to a motion that a mobile terminal has made.
BACKGROUND ART
Currently, electronic devices such as televisions, MP3 players, portable multimedia players (PMPs), smart phones, and the like are equipped with a variety of input/output devices in order to allow a user to conveniently control the electronic device.
Among input/output devices, the use of touch screens has recently increased exponentially. In particular, the proportion of touch panels is gradually increasing in the mobile terminal market, including portable phones, smart phones, and laptops. As touch screen panels are expected to gain more popularity in most smart phones, the touch screen market for mobile terminals will grow rapidly. Touch screen panels are also widely used in electronic appliances such as TVs and refrigerators. The electronic appliance market is expected to rank second in adopting touch screen panels, following the mobile terminal market.
Recently, extensive research has been conducted on recognizing user intention and action based on visual information, for natural interaction between a user and a touch screen. In particular, user-friendly interfaces are configured to recognize a gesture input by a finger or a touch pen. The trend of user interfaces is now shifting from an interface that operates according to a single-touch input based on a single finger to an interface that operates according to a multi-touch input based on multiple fingers on a touch screen.
The touch screen is a stack of a plane for detecting an input and a plane serving as a display. Therefore, the user's intention can be analyzed and perceived from a multi-touch input on the touch screen, and the results of the analysis and perception can be output on the touch screen. In particular, a multi-touch user interface is designed in such a way that the number of finger/pen touches on the touch screen and an associated operation are recognized and an associated command is executed. The interior structure of a mobile terminal supporting multi-touch input will be described below.
Figure 1 is a block diagram of a conventional mobile terminal supporting multi-touch input. In Figure 1, the mobile terminal includes an output unit 101, a controller 103, a memory 105, and an input unit 107.
Referring to Figure 1, the output unit 101 outputs an image, such as a drawing or a web page, on a touch screen. The image is stored in the memory 105. When the image is output on the touch screen, the user can enlarge or shrink the image using the input unit 107 according to a user selection. A designated portion of the image displayed on the touch screen can be enlarged or reduced by touching it with two fingers simultaneously. Instead of fingers, a touch pen can be used. Upon receipt of a multi-touch input through the input unit 107, the controller 103 controls the output unit 101 to display the enlarged or reduced multi-touched area on the touch screen.
An exemplary operation performed by a multi-touch input on the mobile terminal having the configuration illustrated in Figure 1 will be described below.
Figures 2A and 2B illustrate an exemplary conventional operation performed by a multi-touch input. The operation is specifically to enlarge a specific area by making multiple touches on a touch screen.
Referring to Figure 2A, a user touches the touch screen 201 with two fingers 205 in a pinch motion to enlarge a car image 203 displayed on the touch screen 201. The enlarged area can differ depending on the position at which the two fingers 205 touch. Referring to Figure 2B, the car image 203 is enlarged by spreading apart the two fingers 205 on the touch screen 201. The degree to which the car image 203 is enlarged may depend on the distance between the two spread fingers 205.
Although only the operation of enlarging an image is illustrated in Figures 2A and 2B, the enlarged image can also be reduced using two fingers 205.
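For concreteness, the following minimal Kotlin sketch shows the conventional pinch computation described above, where the zoom ratio tracks the ratio of the current to the initial distance between the two touched points; all names and the helper structure are illustrative, not taken from the patent:

```kotlin
import kotlin.math.hypot

// Illustrative two-finger pinch zoom: the scale applied to the image is the
// ratio of the current finger distance to the distance at touch-down.
data class Point(val x: Float, val y: Float)

fun pinchScale(downA: Point, downB: Point, nowA: Point, nowB: Point): Float {
    val initial = hypot(downA.x - downB.x, downA.y - downB.y)
    val current = hypot(nowA.x - nowB.x, nowA.y - nowB.y)
    return if (initial == 0f) 1f else current / initial // >1 enlarges, <1 reduces
}

fun main() {
    // Fingers spread from ~141 px apart to ~283 px apart: the image doubles.
    val s = pinchScale(Point(100f, 100f), Point(200f, 200f),
                       Point(50f, 50f), Point(250f, 250f))
    println("zoom ratio: $s") // ~2.0
}
```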
Conventionally, the zoom-in or zoom-out ratio of a predetermined part of the touch screen can be adjusted only by pressing a preset zoom-in or zoom-out key or by making a multi-touch input. When a user wants to move from one page to another or from one part to another on the same page, the user must press a predefined shift key or touch and drag an area of the touch screen. This means that, for continuous zoom-in and zoom-out, the user must make consecutive key inputs or continuous multi-touch inputs. In addition, to move from one specific part to another on the same page, the user must inconveniently make key inputs in succession or continuous touch-and-drag gestures. For this reason, it is difficult to perform a user-intended operation quickly and precisely through an input on the touch screen or a gesture drawn on the touch screen.
DISCLOSURE OF THE INVENTION
TECHNICAL PROBLEM
An aspect of embodiments of the present invention is to address at least the problems and/or disadvantages described above and to provide at least the advantages described below. Accordingly, an aspect of embodiments of the present invention is to provide an apparatus and method for controlling a user interface so that an image can be enlarged, reduced, and shifted based on a motion.
SOLUTION TO THE PROBLEM
In accordance with an embodiment of the present invention, an apparatus for controlling a motion-based user interface is provided, in which a touch screen displays an image and receives input of a user touch at at least one point, a sensor unit detects a movement of the apparatus, a motion calculator calculates a degree of movement of the apparatus upon detection of the movement by the sensor unit, and, when at least one point is touched on the touch screen, a controller determines the number of touched points, receives information about the degree of movement of the apparatus from the motion calculator, and determines whether to change the image size or the image position according to the number of touched points.
In accordance with another embodiment of the present invention, a method for controlling a motion-based user interface is provided, in which an image is displayed on a touch screen, the number of touched points is determined when at least one point is touched on the touch screen, it is determined whether to change the size of the image or the position of the image according to the number of touched points, and the size of the image or the position of the image is changed according to a movement of a device including the touch screen.
ADVANTAGEOUS EFFECTS OF THE INVENTION
According to the apparatus and method of the present invention, the user can easily zoom in on, zoom out of, and shift an image through a motion.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects, features and advantages of certain embodiments of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Figure 1 is a block diagram of a conventional mobile terminal supporting multi-touch input;
Figures 2A and 2B illustrate an exemplary conventional operation that is performed based on a multi-touch input;
Figure 3 is a block diagram of a mobile terminal for controlling a motion-based user interface according to an embodiment of the present invention;
Figures 4A to 5C are exemplary views illustrating movements of the mobile terminal according to an embodiment of the present invention;
Figures 6A and 6B are exemplary views illustrating an operation to enlarge or shrink an image according to an embodiment of the present invention;
Figures 7A and 7B are exemplary views illustrating an operation for moving an image according to an embodiment of the present invention;
Figures 8A and 8B are exemplary views illustrating operations for controlling a motion-based user interface using predetermined keys according to an embodiment of the present invention; and
Figure 9 is a flow chart illustrating an operation for controlling a motion-based user interface according to an embodiment of the present invention.
Throughout the drawings, the same drawing reference numerals will be understood to refer to the same elements, features and structures.
MODE FOR THE INVENTION
Reference will now be made in detail to preferred embodiments of the present invention with reference to the accompanying drawings. Although specific details such as components are described in the following description, they are provided to aid in a comprehensive understanding of the present invention, and it will be clearly understood by those skilled in the art that variations and modifications can be made to the details within the scope and spirit of the present invention.
Figure 3 is a block diagram of a mobile terminal for controlling a motion-based user interface according to an embodiment of the present invention. In Figure 3, the mobile terminal includes an output unit 301, a controller 303, a memory 305, an input unit 307, a motion calculator 309, and a sensor unit 311.
Referring to Figure 3, the output unit 301 outputs an image, such as a photo or a web page, on a touch screen. The image is stored in the memory 305. When necessary, another image stored in the memory 305 can be used, or a desired photo can be captured using a camera module equipped in the mobile terminal.
The input unit 307 receives information about the number of points touched on the touch screen, a drag made on the touch screen, and the like. The motion calculator 309 calculates the direction of movement or angle of the mobile terminal and transmits information about the calculated direction of movement or angle to the controller 303. The mobile terminal can basically move up, down, to the left, and to the right. In addition, the mobile terminal can move forward in a horizontal direction toward the user or backward in a horizontal direction away from the user. The angle at which the mobile terminal is moved is calculated as an angle at which the mobile terminal is tilted with respect to its current position, defined as 0 degrees. The motion calculator 309 uses the sensor unit 311 to calculate the direction of movement or angle of the mobile terminal. More specifically, the motion calculator 309 calculates the direction or angle in which the mobile terminal has moved using a gyroscope sensor from among one or more sensors in the sensor unit 311. The gyroscope sensor is a sensor adapted to detect rotation in addition to performing the functions of an acceleration sensor, and can therefore track motion along six axes. Therefore, compared to the acceleration sensor, the gyroscope sensor can detect more precise movements. The gyroscope sensor detects rotation and tilt as well as acceleration and deceleration.
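The patent does not disclose the motion calculator's formulas. As a hedged illustration, the Kotlin sketch below integrates gyroscope angular-rate samples (radians per second, as typical gyroscope sensors deliver) into angles relative to the current position defined as 0 degrees; the class name, axis naming, and sampling interface are assumptions:

```kotlin
// Hypothetical motion calculator (cf. unit 309): integrates gyroscope
// angular-rate samples into angles relative to the start pose (0 degrees).
class MotionCalculator {
    var angleAboutYDeg = 0.0  // forward tilt / backward recline
        private set
    var angleAboutZDeg = 0.0  // left / right rotation
        private set

    // rateY/rateZ: angular velocity in rad/s about the Y and Z axes;
    // dtSec: time since the previous sample. Simple rectangular integration.
    fun onGyroSample(rateYRadPerSec: Double, rateZRadPerSec: Double, dtSec: Double) {
        angleAboutYDeg += Math.toDegrees(rateYRadPerSec * dtSec)
        angleAboutZDeg += Math.toDegrees(rateZRadPerSec * dtSec)
    }

    fun reset() { // re-zero when a new touch gesture begins
        angleAboutYDeg = 0.0
        angleAboutZDeg = 0.0
    }
}
```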
The controller 303 receives information about the number of touched points and information about a touch and drag on the touch screen from the input unit 307, receives information about a calculated movement of the mobile terminal from the motion calculator 309, and controls an image displayed on the touch screen differently according to the number of touched points.
If two or more points are touched in any area of the touch screen, the controller 303 determines the multi-touch input to be a command to enlarge or reduce an image displayed on the touch screen, and enlarges or reduces the image according to the information about a change in the movement of the mobile terminal received from the motion calculator 309. For example, when two points are touched in any area of the touch screen and the mobile terminal is brought closer to the user at an unchanged angle, or the touch screen of the mobile terminal is tilted forward with respect to the user, the image is enlarged. On the other hand, when the mobile terminal is moved away from the user at an unchanged angle, or the touch screen of the mobile terminal is reclined backward away from the user, the image is reduced. This method has the same effects as those obtained in the conventional multi-touch scheme in which an image is enlarged or reduced with two fingers, which will be apparent from the description of Figures 4A, 4B and 4C.
Figures 4A, 4B and 4C are exemplary views illustrating movements of the mobile terminal in accordance with an embodiment of the present invention. In Figures 4A, 4B and 4C, it is assumed that a user touches two points on the touch screen.
Referring to Figure 4A, when the user pulls the mobile terminal in the Z-axis direction while facing the touch screen of the mobile terminal and touching any two points on the touch screen, with the current angle of the mobile terminal unchanged, a displayed image is enlarged. On the contrary, when the user pushes the mobile terminal in the -Z-axis direction while facing the touch screen of the mobile terminal, the displayed image is reduced. Figure 4B is a side view of the mobile terminal, which illustrates a path along which the mobile terminal is moved. Note from Figure 4B that the mobile terminal is moved only in the Z-axis or -Z-axis direction, without a change in its angle. Figure 4C illustrates a variation in the tilt of the mobile terminal, with the Y and -Y axes defined as a rotation axis. An image displayed on the touch screen can be enlarged or reduced by tilting the mobile terminal forward or backward with respect to the user.
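As a rough illustration of this two-point zoom mapping, and not a formula from the patent, a zoom factor could be derived from the Z-axis displacement and the forward tilt as follows; the gains and clamping range are invented for the example:

```kotlin
// Hypothetical two-point zoom mapping (Figures 4A-4C): pulling the terminal
// toward the user (+Z) or tilting it forward zooms in; pushing it away (-Z)
// or reclining it backward zooms out. Gains (0.05/cm, 0.01/deg) are made up.
fun zoomFactor(zPullCm: Double, forwardTiltDeg: Double): Double =
    (1.0 + 0.05 * zPullCm + 0.01 * forwardTiltDeg).coerceIn(0.25, 4.0)
```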
If the user touches one point in any area of the touch screen, the controller 303 determines the single-touch input to be a command to shift an image displayed on the touch screen, and shifts the image up, down, to the left, or to the right according to the information about a change in the movement of the mobile terminal received from the motion calculator 309. For example, if the user moves the mobile terminal to the left or rotates the mobile terminal to the left while touching one point in any area of the touch screen, a displayed image is shifted to the left. That is, the same effect is achieved as with a scheme in which an image is moved in an intended direction by touching and dragging an area of the touch screen, which will be apparent from Figures 5A, 5B and 5C.
Figures 5A, 5B and 5C illustrate movements of the mobile terminal in accordance with an embodiment of the present invention. In Figures 5A, 5B and 5C, it is assumed that the user touches one point on the touch screen.
Referring to Figure 5A, when the user moves the mobile terminal in the X-axis, -X-axis, Y-axis, or -Y-axis direction while touching one point on the touch screen of the mobile terminal, with the current angle of the mobile terminal unchanged, an image displayed on the touch screen moves in the direction in which the mobile terminal is moved. Figure 5B illustrates changes in the rotation of the mobile terminal, with the Z and -Z axes defined as a rotation axis, and Figure 5C illustrates changes in the rotation of the mobile terminal, with the Y and -Y axes defined as a rotation axis. If the Z and -Z axes are defined as a rotation axis and the mobile terminal is rotated to the right or to the left, the displayed image can move laterally. On the other hand, if the Y and -Y axes are defined as a rotation axis and the touch screen of the mobile terminal is tilted forward or reclined backward, the displayed image can move up or down.
If the Y and -Y axes are defined as a rotation axis, the displayed image can be transformed differently according to the number of points touched on the touch screen. In Figure 4C, the image is enlarged or reduced, while in Figure 5C, the image is shifted up or down.
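A comparable hedged sketch for the single-point case: rotation about the Z axis and plain X/Y translation shift the image laterally, while tilt about the Y axis shifts it vertically; the pixels-per-degree and pixels-per-cm gains are invented:

```kotlin
// Hypothetical single-point shift mapping (Figures 5A-5C). Gains are
// illustrative only; the patent specifies proportionality, not values.
data class Shift(val dxPx: Double, val dyPx: Double)

fun shiftFromMotion(rotAboutZDeg: Double, tiltAboutYDeg: Double,
                    dxCm: Double, dyCm: Double): Shift {
    val pxPerDeg = 8.0
    val pxPerCm = 40.0
    return Shift(dxPx = rotAboutZDeg * pxPerDeg + dxCm * pxPerCm,
                 dyPx = tiltAboutYDeg * pxPerDeg + dyCm * pxPerCm)
}
```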
As is conventionally done, the controller 303 can also control an image displayed on the touch screen in correspondence with a gesture, such as a drag, that can occur after at least one point is touched in any area of the touch screen. For example, if one point is touched in any area of the touch screen and the touched point is dragged to the left, the screen image can be shifted to the left or the next image can be displayed. If five points are touched on the touch screen, the image can be enlarged or reduced according to a conventional multi-touch scheme.
Now, a description will be given of an operation to control a displayed image, with reference to Figure 3.
Figures 6A and 6B are exemplary views illustrating an operation for enlarging or reducing an image according to an embodiment of the present invention.
Referring to Figure 6A, the user pulls the mobile terminal toward himself, grasping it so that two points in any area of the touch screen are touched, while viewing the touch screen. The displayed image is enlarged in correspondence with the pulling motion. Figure 6B is a side view illustrating the user pulling the mobile terminal. Note that the image is enlarged simply by bringing the mobile terminal closer to the user, as shown in Figures 4A and 4B, without tilting the mobile terminal forward with respect to the user.
Although not shown in Figures 6A and 6B, when the user pushes the mobile terminal backward, grasping it so that two points in any area of the touch screen are touched, while viewing the touch screen, the displayed image can be reduced. The function of enlarging an image by pulling the mobile terminal and reducing an image by pushing it can be modified by a user configuration. For example, the function can be changed in such a way that an image is reduced when the mobile terminal is pulled and enlarged when it is pushed.
The zoom-in or zoom-out ratio of an image can be configured to be proportional to a measured rotation angle in each axis direction illustrated in Figures 4A to 5C, using the gyroscope sensor included in the sensor unit 311 of the mobile terminal, which is moved according to a user motion. Even when this zoom-in or zoom-out function is available, the user can enlarge or reduce an image according to the conventional multi-touch scheme. While the conventional multi-touch-based zoom-in or zoom-out function is in progress, the motion-based zoom-in or zoom-out function can be disabled to avoid mutual interference between the two zoom functions.
If only one point is touched or no touch is made on the touch screen, the motion-based zoom-in or zoom-out operation is disabled in the present invention.
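Read as pseudocode, the enable/disable rules of the last two paragraphs could look like the following Kotlin sketch; the enum and predicate names are invented:

```kotlin
// Hypothetical arbitration between the conventional and motion-based zoom:
// a moving multi-touch (a pinch in progress) wins, a stationary multi-touch
// enables motion zoom, and one or zero touched points disable zooming.
enum class ZoomSource { PINCH, MOTION, NONE }

fun activeZoomSource(touchedPoints: Int, touchedPointsMoving: Boolean): ZoomSource = when {
    touchedPoints >= 2 && touchedPointsMoving -> ZoomSource.PINCH
    touchedPoints >= 2 -> ZoomSource.MOTION
    else -> ZoomSource.NONE
}
```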
Figures 7A and 7B are exemplary views illustrating an operation for moving an image according to an embodiment of the present invention.
Referring to Figure 7A, the user moves the mobile terminal from left to right while viewing the touch screen and grasping the mobile terminal so that one point is touched in any area of the touch screen. An image displayed on the touch screen moves from right to left in correspondence with the movement of the mobile terminal.
Referring to Figure 7B, which focuses on the touch screen, the left touch screen displays an image before the movement, and the right touch screen shows the image output when the mobile terminal moves from left to right. As the image moves, a portion of the image hidden beyond the right edge of the screen appears on the screen. In the case where a plurality of images are arranged, the images can be shown sequentially on the screen by moving the mobile terminal.
The distance by which an image is shifted can be configured to be proportional to a measured rotation angle in each axis direction illustrated in Figures 4A to 5C, using the gyroscope sensor included in the sensor unit 311 of the mobile terminal, which is moved according to a user motion. Even when this image-shift function is available, the user can shift an image by touching one point and dragging the touch on the touch screen. While the drag function is in progress, the motion-based image-shift function can be disabled to prevent mutual interference between the drag function and the image-shift function of the present invention.
If two or more points are touched or no touch is made on the touch screen, the motion-based image-shift operation is disabled in the present invention.
While it has been described according to the present invention that a displayed image can be controlled differently according to the number of points touched in any area of the touch screen, the same can be achieved by assigning specific keys and controlling the displayed image according to the number of pressed keys, which will be described below.
Figures 8A and 8B are exemplary views illustrating an operation to control a motion-based user interface using specific keys assigned according to an embodiment of the present invention.
Referring to Figure 8A, when the user presses the digit-1 and digit-3 buttons at the same time, or within a predetermined interval, on a keypad while an image viewer is running, the mobile terminal can be configured to recognize the input as a command to enlarge or reduce a displayed image. According to a subsequent movement of the mobile terminal, the displayed image can be enlarged or reduced.
Referring to Figure 8B, when the user presses one of the digit-1 and digit-3 buttons on the keypad while the image viewer is running, the mobile terminal can be configured to recognize the input as a command to shift a displayed image. Meanwhile, the keypad can be replaced with buttons that operate by user touches.
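For illustration, a minimal Kotlin sketch of this key-based mode selection, with invented names; the patent does not specify the key-handling details:

```kotlin
// Hypothetical key-based mode selection (Figures 8A-8B): digits 1 and 3
// together select the zoom mode, either one alone selects the shift mode.
enum class MotionMode { ZOOM, SHIFT, NONE }

fun modeFromKeys(pressedDigits: Set<Int>): MotionMode = when {
    pressedDigits.containsAll(setOf(1, 3)) -> MotionMode.ZOOM
    1 in pressedDigits || 3 in pressedDigits -> MotionMode.SHIFT
    else -> MotionMode.NONE
}
```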
Figure 9 is a flowchart illustrating an operation for controlling a motion-based user interface according to an embodiment of the present invention.
Referring to Figure 9, upon receipt of a user input, the mobile terminal runs an image viewer to display an image, such as a photo or a web page, in step 901. The image is stored in the memory 305. If the mobile terminal is enabled for wireless Internet, it can display a web page without running the image viewer. In step 903, the mobile terminal determines whether a touch input has been detected through the input unit 307. If a touch input has not been detected, the mobile terminal continues to run the image viewer in step 901. Upon detection of a touch input, the mobile terminal proceeds to step 905.
In step 905, the mobile terminal counts the number of points touched on the touch screen and controls a displayed image differently according to the number of touched points. If two or more points have been touched, the mobile terminal determines to enlarge or reduce the image and begins to detect movement of the mobile terminal. In step 907, if the mobile terminal moves forward or backward, or is tilted forward with respect to the user or reclined backward, as shown in Figures 4A, 4B and 4C, the mobile terminal calculates a degree to which it moves up, down, to the left, to the right, backward, or forward, or a degree to which it is tilted forward, reclined backward, rotated to the left, or rotated to the right, using the gyroscope sensor included in the sensor unit, and enlarges or reduces the image in correspondence with the calculated degree. The movement of the mobile terminal is detected by the gyroscope sensor of the sensor unit 311, and the distance the mobile terminal has moved or the degree to which it has been tilted is calculated by the motion calculator 309. In addition, two or more buttons can be assigned and pressed instead of two touched points to enlarge or reduce the image.
If a single point has been touched, the mobile terminal determines to shift the image and begins to detect movement of the mobile terminal. When the mobile terminal moves up, down, to the left, or to the right, or is tilted forward, reclined backward, rotated to the left, or rotated to the right, as shown in Figures 5A, 5B and 5C, the distance moved or the tilt or rotation degree of the mobile terminal is calculated, and the image is shifted in a direction corresponding to the calculated degree. As indicated above, the movement of the mobile terminal is detected by the gyroscope sensor of the sensor unit 311, and the distance the mobile terminal has moved or the degree to which it has been tilted is calculated by the motion calculator 309. In addition, a specific button can be assigned and pressed instead of one touched point to shift the image.
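Tying the steps of Figure 9 together, the following hedged Kotlin sketch routes the counted touch points either to the zoom mapping or to the shift mapping sketched earlier (assumed to be in the same file); the command types and wiring are illustrative, not the patent's implementation:

```kotlin
// Hypothetical dispatch for the Figure 9 flow: two or more touched points
// select motion-based zoom (step 907), a single point selects motion-based
// shift, and no touch leaves the viewer unchanged (back to step 901).
sealed interface UiCommand
data class ZoomCommand(val factor: Double) : UiCommand
data class ShiftCommand(val shift: Shift) : UiCommand
object NoCommand : UiCommand

fun dispatch(touchedPoints: Int, zPullCm: Double, forwardTiltDeg: Double,
             rotAboutZDeg: Double, tiltAboutYDeg: Double,
             dxCm: Double, dyCm: Double): UiCommand = when {
    touchedPoints >= 2 -> ZoomCommand(zoomFactor(zPullCm, forwardTiltDeg))
    touchedPoints == 1 -> ShiftCommand(shiftFromMotion(rotAboutZDeg, tiltAboutYDeg, dxCm, dyCm))
    else -> NoCommand
}
```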
Although the present invention has been particularly shown and described with reference to embodiments thereof, it should be understood by those skilled in the art that various changes in form and detail can be made without departing from the spirit and scope of the present invention, as defined by the claims that follow.
Claims:
Claims (6)
[0001]
1. Apparatus for controlling a motion-based user interface, the apparatus characterized by the fact that it comprises: a touch screen (307) configured to display an image and receive input of a user touch at at least one point; a sensor unit (311) configured to detect movement of the apparatus; and a controller (303) configured to: determine a number of touched points when at least one point is touched on the touch screen; and obtain information corresponding to the movement of the apparatus, wherein the controller (303) is further configured to determine whether the number of touched points is two or more, whether the touched points move, and whether the movement of the apparatus is detected, wherein, if the number of touched points is two or more, the touched points do not move, and the movement of the apparatus is detected, the controller (303) is further configured to: display an image on the touch screen by zooming out of the image based on a determination that the information indicates that the apparatus has moved away from the user; or display the image on the touch screen by zooming in on the image based on a determination that the information indicates that the apparatus has moved closer to the user, and wherein, if the number of touched points is two or more and the touched points move, the controller is further configured to disable the motion-based zoom function, and display an image on the touch screen by zooming in or out based on the movement of the touched points.
[0002]
2. Apparatus according to claim 1, characterized by the fact that, if the number of touched points is one, the controller (303) is configured to recognize the user touch corresponding to the touched point as a command to shift the image, obtain information related to the movement of the apparatus for shifting the image on the touch screen, and shift the image based on the obtained information for shifting the image.
[0003]
3. Apparatus according to claim 1, characterized by the fact that it further comprises: a motion calculator (309) configured to calculate: a degree of movement of the apparatus, including a degree to which the apparatus moves up, down, to the left, to the right, backward, or forward, using an acceleration sensor included in the sensor unit (311); or a degree to which the apparatus is tilted forward, reclined backward, rotated to the left, or rotated to the right, using a gyroscope sensor included in the sensor unit, wherein the controller (303) is configured to determine the information related to the movement of the apparatus based on the degree of movement of the apparatus received from the motion calculator (309).
[0004]
4. Method for controlling a motion-based user interface, the method characterized by the fact that it comprises: displaying an image on a touch screen (901); detecting a movement of the device with a sensor unit (903); determining a number of touched points when at least one point is touched on the touch screen (905); and determining whether the touched points move, wherein, if the number of touched points is two or more, the touched points do not move, and the movement of the device is detected, the method further comprises: displaying an image on the touch screen by zooming out of the image based on a determination that the information corresponding to the movement of the device indicates that the device moves away from the user; or displaying the image on the touch screen by zooming in on the image based on a determination that the information corresponding to the movement of the device indicates that the device moves closer to the user (907), and wherein, if the number of touched points is two or more and the touched points move, the method further comprises disabling the motion-based zoom function, and displaying an image on the touch screen by zooming in or out based on the movement of the touched points.
[0005]
5. Method according to claim 4, characterized by the fact that it further comprises: if the number of touched points is one, recognizing the user touch corresponding to the touched point as a command to shift the image; obtaining information related to the movement of the device for shifting the image on the touch screen; and shifting the image based on the obtained information for shifting the image.
[0006]
6. Method according to claim 4, characterized by the fact that the information related to the movement of the device is determined based on a degree of movement of the device, which is measured by calculating a degree to which the device moves up, down, to the left, to the right, backward, or forward, or a degree to which the device is tilted forward, reclined backward, rotated to the left, or rotated to the right.
Patent family:
Publication No. | Publication date
JP5828000B2|2015-12-02|
CN103168282A|2013-06-19|
WO2012050377A2|2012-04-19|
AU2011314532A1|2013-04-18|
RU2541852C2|2015-02-20|
KR101915615B1|2019-01-07|
EP2628067B1|2019-04-17|
JP2015215907A|2015-12-03|
KR20120038788A|2012-04-24|
BR112013009161A2|2016-07-26|
EP2628067A4|2016-08-31|
CN107066045A|2017-08-18|
EP3543832B1|2021-10-06|
JP2013541104A|2013-11-07|
US20170178284A1|2017-06-22|
WO2012050377A3|2012-06-14|
US9588613B2|2017-03-07|
US20130194222A1|2013-08-01|
RU2013121805A|2014-11-20|
AU2011314532B2|2015-08-20|
EP2628067A2|2013-08-21|
EP3543832A1|2019-09-25|
US10360655B2|2019-07-23|
Legal status:
2018-12-26| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-10-01| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-07-28| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-11-24| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 13/10/2011, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application No. | Filing date | Title
KR1020100100435A|KR101915615B1|2010-10-14|2010-10-14|Apparatus and method for controlling user interface based motion|
KR10-2010-0100435|2010-10-14|
PCT/KR2011/007628|WO2012050377A2|2010-10-14|2011-10-13|Apparatus and method for controlling motion-based user interface|