MOBILE VIEWING DEVICE, AND A METHOD TO ALLOW A USER TO OBTAIN BOTH A THREE-DIMENSIONAL AND A TWO-DIMENSIONAL VIEW OF CONTENT
Patent abstract:
Mobile viewing device. The present invention relates to a mobile display device (100) for enabling a user to obtain both a three-dimensional [3D] and a two-dimensional [2D] view of content, comprising a 3D screen (120) to allow viewing of the content in 3D, a 2D screen (140) to enable viewing of the content in 2D, and a display processor to display the content as 3D content (124) on the 3D screen and as 2D content (144) on the 2D screen, the 3D screen and the 2D screen being arranged on opposite faces (122, 142) of the device to enable the user to obtain either the 3D view or the 2D view of the content by turning (102) the device over.

Publication number: BR112014016867B1
Application number: R112014016867-9
Application date: 2012-12-21
Publication date: 2021-09-08
Inventors: Waltherus Antonius Hendrikus Roelen; Hans Zuidema; Abraham Karel Riemens
Applicant: Ultra-D Coöperatief U.A
Patent description:
Field of the Invention

[001] The invention relates to a mobile display device and a method for enabling a user to obtain either a three-dimensional [3D] or a two-dimensional [2D] view of content. The invention further relates to a tablet device, a smartphone and a digital photo frame comprising the mobile display device, and to a computer program product for carrying out the method.

[002] Mobile display devices are devices of a portable form factor that comprise a screen for displaying content to a user. Such devices often comprise batteries so as to be operational without having to be plugged into an electrical outlet. Examples of such mobile display devices are smartphones such as the Apple iPhone and tablet devices such as the Apple iPad or the Samsung Galaxy Tab. Some types of mobile display devices do not comprise batteries and therefore need to be plugged into a power outlet to be operational, for example, digital photo frames, but they nevertheless have a portable form factor that allows, for example, easy placement within a home or workplace.

[003] The aforementioned devices have in common that they provide the user with access to one or several types of content. This content is often visual content, and as such, mobile display devices often comprise a screen that is large with respect to the device itself in order to display movies, internet content, games, photos, etc., as well as possible.

Background of the Invention

[004] An internet publication, http://www.pocket-lint.com/News/40223/asus-announces-eeepad-3d-details, as obtained on August 8, 2011, details a planned ASUS-made tablet device. According to the publication, the tablet device comprises a 3D screen, and it is noted that, "instead of working fully in 3D, something that will presumably put a heavy drain on your eyes, the device has a switchable 3D mode."

Summary of the Invention

[005] A problem of the above-mentioned tablet device having a switchable 3D mode is that the quality of the switchable 3D mode is insufficient.

[006] It would be advantageous to have an improved mobile display device or method for enabling a user to view content in both 2D and 3D.

[007] To better address this concern, a first aspect of the invention provides a mobile display device for enabling a user to obtain both a three-dimensional [3D] and a two-dimensional [2D] view of content, comprising a 3D screen to allow viewing of the content in 3D, a 2D screen to enable viewing of the content in 2D, and a display processor to display the content as 3D content on the 3D screen and to display the content as 2D content on the 2D screen, the 3D screen and the 2D screen being arranged on opposite faces of the device to enable the user to obtain either the 3D view or the 2D view of the content by turning the device over.

[008] In another aspect of the invention, a tablet device or smartphone comprising the above mobile display device is provided. In another aspect of the invention, a digital photo frame comprising the above mobile display device is provided.

[009] The above-mentioned measures allow a user to obtain either a 2D or a 3D view of content on a mobile display device. For this purpose, the device comprises a 3D screen that allows the user to perceive the content in 3D, that is, through stereoscopy, which involves enabling each of the user's eyes to perceive a slightly different view of the content and thereby providing an impression of depth to the user.
In addition to viewing in 3D, the device comprises a 2D screen on which the user can perceive the content in 2D, that is, without stereoscopy. The device further comprises a display processor that is capable of providing the view of the 3D content on the 3D screen and the view of the 2D content on the 2D screen. The content can be any content that is viewable on a mobile device, such as, for example, movies, internet content, photos, games, applications, etc. The 3D screen and the 2D screen are arranged on external surfaces of the device that are opposite to each other, that is, both surfaces face away from each other. Consequently, the display directions of the 3D screen and the 2D screen, as defined by, for example, the normal vectors of the viewing surfaces of these screens, also face away from each other.

[0010] The above measures have the effect that a mobile display device is provided comprising two screens arranged on opposite faces of the device, one of the screens being a 3D screen and the other being a 2D screen, to enable the user to obtain either a 3D view or a 2D view of the content by turning the device over, that is, from one face to the other, the faces being arranged in opposition. As a result, a user can choose between viewing the content in 3D and viewing it in 2D by simply turning the mobile display device over. The present invention is partly based on the recognition that a front-to-back rotation of the device is an intuitive way of switching between 3D and 2D viewing of the content. Advantageously, when viewing the content on the 3D screen, the user is not distracted by the 2D screen, or vice versa, as it is hidden from view on the opposite face of the device. Advantageously, the size of the 3D screen is not limited by the presence of the 2D screen, or vice versa, since the two screens are arranged on opposite faces of the device. Advantageously, each screen can be specifically optimized for 3D or 2D viewing, rather than, when using a single 3D screen, having to compromise 2D viewing on it.

[0011] Optionally, the device further comprises an orientation means for generating orientation data indicative of an orientation of the device, and the display processor is arranged to display the content on either the 3D screen or the 2D screen in dependence on the orientation data. Optionally, a method is provided for enabling a user to obtain either a 3D or a 2D view of content using a mobile display device, the mobile display device comprising a 3D screen to allow viewing of the content in 3D, a 2D screen to enable viewing of the content in 2D, the 3D screen and the 2D screen being arranged on opposite faces of the device, and the method comprising generating orientation data indicative of an orientation of the device, and displaying, in dependence on the orientation data, the content either as 3D content on the 3D screen or as 2D content on the 2D screen, to enable the user to obtain the 3D view or the 2D view of the content by turning the device over.

[0012] The above measures have the result that the orientation of the device affects whether the content is displayed on the 3D screen or on the 2D screen. The orientation of the device typically determines the visibility of either of the screens to the user; for example, it may be a result of the user having oriented the device so as to specifically view one of the screens. In such cases, the oppositely arranged screen is not visible to the user, and neither is its content.
The orientation data is therefore used to determine on which of the two screens the content is displayed.

[0013] Advantageously, by displaying the content on only one of said screens, the energy consumption of the other screen, i.e., the one on which the content is not displayed, is reduced.

[0014] Optionally, the display processor is arranged to switch between displaying the content on the 3D screen and displaying the content on the 2D screen in dependence on a change in the orientation of the device. A change in the orientation of the device is typically due to a user action; for example, it may be a consequence of the user turning the device over to view the screen facing away from the one he is currently viewing. To accommodate this, the change in the orientation of the device is used to switch between displaying the content on the 3D screen and displaying the content on the 2D screen. Advantageously, a change in orientation can be detected more reliably than an absolute degree of orientation. Advantageously, a change in orientation can be detected earlier than the absolute degree of orientation resulting from the change, therefore allowing the display processor to switch earlier or to introduce said switch gradually.

[0015] Optionally, the orientation means comprises a first camera for estimating the orientation of the device with respect to the user by detecting a presence of the user in a viewing window of the first camera. The orientation of the device with respect to the user is of particular relevance, since said orientation determines the visibility of either of the screens to the user. By detecting the user's presence in the viewing window of the first camera, the user's location with respect to the first camera, and therefore with respect to the device itself, is known. As such, the content can be displayed on either the 3D screen or the 2D screen in dependence on orientation data that is indicative of the user's location with respect to the device.

[0016] Optionally, the orientation means comprises an accelerometer for establishing a direction of gravity, and the display processor is arranged to display the content on an upward or downward facing one of the 3D screen and the 2D screen by comparing the orientation of the device with the direction of gravity. The accelerometer is used to establish which of the 3D screen and the 2D screen is facing upwards, or alternatively downwards, so as to, irrespective of the orientation of the device, always display the content on the upward facing, or alternatively the downward facing, one of said screens. This optional measure is based on the recognition that the user normally views the device from an upward direction, that is, he looks down at the device. In this case, the content can be displayed on the screen that is facing upwards. Alternatively, in some situations, for example, when lying on a sofa, the user normally views the device from a downward direction, i.e., looking up at the device. For this reason, as an alternative, the content can be displayed on the screen that is facing downwards.

[0017] Optionally, the device further comprises a user input for enabling the user to configure the display processor to display the content on either the upward facing or the downward facing one of the 3D screen and the 2D screen. In this way, the user can select whether the content should, by default, be displayed on the screen that is facing upwards or on the screen that is facing downwards. Advantageously, the user can configure the device for various usage situations, e.g., operating the device while lying on a sofa, standing, sitting in a chair, etc.
[0018] Optionally, the device further comprises a first camera arranged alongside the 3D screen and a second camera arranged alongside the 2D screen, the first camera having a first camera function in the device, the second camera having a second camera function in the device, and the display processor being arranged to swap the first camera function and the second camera function in the device in dependence on the orientation data. The device thus comprises two cameras facing opposite directions. The functionality of the cameras typically relates to an orientation of the device with respect to the user. For example, in video chat applications, the camera facing the user may be used for recording and transmitting video of the user, whereas in photography applications, the camera facing away from the user may be used for taking pictures. It is desirable that, when the user turns the device over to switch between 2D and 3D viewing of the content, the functionality of the cameras in the device is changed accordingly. To accommodate this, the first camera function of the first camera and the second camera function of the second camera are swapped in the device, and as a result, their functionality is swapped in, for example, a video chat application running on the device.

[0019] Optionally, the display processor is arranged to perform a 2D-to-3D conversion when the content is originally 2D content, for displaying said content as 3D content on the 3D screen, or to perform a 3D-to-2D conversion when the content is originally 3D content, for displaying said content as 2D content on the 2D screen. By performing a 2D-to-3D conversion, the 2D content is converted into a format that is viewable on the 3D screen. By performing a 3D-to-2D conversion, the 3D content is converted into a format that is viewable on the 2D screen. This way, irrespective of whether the content is originally 2D or 3D, it can be displayed on either the 3D screen or the 2D screen.

[0020] Optionally, the display processor is arranged to display an indicator on the 2D screen when displaying originally 3D content on the 2D screen, in order to prompt the user to turn the device over to obtain the view of the 3D content on the 3D screen, or to display an indicator on the 3D screen when displaying originally 2D content on the 3D screen, in order to prompt the user to turn the device over to view the 2D content on the 2D screen. Viewing originally 2D content on a 3D screen is typically less ideal than viewing that content on a 2D screen. Likewise, viewing originally 3D content on a 2D screen is typically less ideal than viewing that content on a 3D screen. By displaying an indicator to the user in the above-mentioned situations, the user can see whether he is viewing originally 2D content on the 3D screen or originally 3D content on the 2D screen. Advantageously, the user is alerted to the fact that he can obtain a view of the content in its original format by turning the device over.

[0021] Optionally, the device further comprises a first touch sensor arranged in or alongside the 3D screen and a second touch sensor arranged in or alongside the 2D screen. By providing a touch sensor in or alongside both screens, touch-based interaction with the user is possible regardless of which screen the user is viewing.
[0022] Optionally, the first touch sensor has a first touch function in the device, the second touch sensor has a second touch function in the device, and the display processor is arranged to swap the first touch function and the second touch function in the device in dependence on the orientation data. The device thus comprises touch sensors arranged on opposite faces of the device. The functionality of the touch sensors typically relates to an orientation of the device with respect to the user. For example, the touch sensor that faces the user may be used for touch-based interaction with the user, while the touch sensor that faces away from the user may be configured to ignore touch input, or be used for additional touch-based interaction with the user. It is desirable that, when the user turns the device over to switch between 2D and 3D viewing of the content, the functionality of the touch sensors in the device is switched accordingly. To accommodate this, the first touch function of the first touch sensor and the second touch function of the second touch sensor are swapped in the device, and as a result, their functionality is swapped in, for example, an application running on the device.

[0023] Optionally, the 3D screen is an autostereoscopic multi-view 3D screen. An autostereoscopic multi-view 3D screen is suitable for use in a mobile device, as the viewing angle of said screen is normally large compared to that of an autostereoscopic two-view 3D screen, i.e., a common type of autostereoscopic display. As such, viewing the content in 3D is possible even in cases where the device is not ideally oriented with respect to the user.

[0024] It will be appreciated by those skilled in the art that two or more of the aforementioned embodiments, implementations and/or aspects of the invention may be combined in any way deemed useful. Modifications and variations of the tablet device, smartphone, method and/or computer program product, which correspond to the described modifications and variations of the mobile display device, can be carried out by a person skilled in the art on the basis of the present description. The invention is defined in the independent claims. Advantageous options are defined in the dependent claims.

[0025] A "Nintendo 3DS" product, as described on the manufacturer's website http://www.nintendo.com/3ds/hardware/specs on August 8, 2011, is a portable gaming console having a so-called clamshell ('flip') form factor. The product comprises, inside the clamshell, an upper screen and a lower screen. The upper screen is referred to as an LCD screen that allows 3D viewing without the need for special glasses, and the lower screen is referred to as an LCD screen having a touch-sensitive surface. It is further noted that one of the features of the upper screen is the ability to adjust or turn off the 3D effect with a so-called 3D depth slider.

Brief Description of the Drawings

[0026] These and other aspects of the invention are apparent from, and will be elucidated with reference to, the embodiments described below.
In the drawings:

[0027] Figure 1a shows a mobile display device according to the present invention;
[0028] Figure 1b shows the mobile display device after it has been turned over;
[0029] Figure 2 shows the mobile display device comprising an orientation means;
[0030] Figure 3 shows a user turning the mobile display device over;
[0031] Figure 4a shows a user operating the mobile display device while standing;
[0032] Figure 4b shows a user operating the mobile display device while lying down;
[0033] Figure 5a shows a mobile display device displaying originally 3D content on a 2D screen and displaying an indicator on the 2D screen;
[0034] Figure 5b shows the mobile display device after it has been turned over;
[0035] Figure 6a shows a mobile display device running a video chat application; and
[0036] Figure 6b shows the mobile display device, after having been turned over, with the functionality of the first and second cameras having been swapped in the video chat application.

Detailed Description of Embodiments

[0037] Figure 1a shows a mobile display device 100, hereinafter referred to as device 100, comprising a 3D screen 120 arranged on a first face 122 of the device 100. The 3D screen 120 allows viewing of content in 3D, the content being, in the following and by way of example, an image of a box. The 3D screen 120 is shown displaying the content as 3D content 124, which is illustrated in Figure 1a by a drawing of the box in 3D perspective being displayed on the 3D screen 120. Figure 1a further shows, illustratively, a turning-over of the device 100 by means of curved arrows 102. Figure 1b shows the device 100 after having been turned over, with a second face 142 of the device 100 being visible, which face is arranged opposite the first face 122 with respect to the device 100, i.e., it is an opposite face with respect to the first face 122. The device 100 comprises a 2D screen 140 arranged on the second face 142 of the device 100. The 2D screen 140 is shown displaying the content as 2D content 144, which is illustrated in Figure 1b by a 2D drawing of the box being displayed on the 2D screen 140.

[0038] Although not shown in Figure 1a or Figure 1b, the device 100 further comprises a display processor for displaying the content as 3D content 124 on the 3D screen 120 and for displaying the content as 2D content 144 on the 2D screen 140. Here, displaying the content as 2D content is to be understood as providing the content in a format that is displayable on the 2D screen 140, that is, as content formatted for viewing on the 2D screen 140. For example, if the content is originally 2D content, i.e., already in a format that is displayable on the 2D screen 140, the display processor may not be required to perform any processing of the content in order to provide the content as the 2D content 144. If the content is originally 3D content, however, the display processor may need to reformat the content, for example by omitting the depth information of the content when the content is originally in the so-called image + depth format, in order to provide only the image information of the image + depth format as the 2D content 144. As another example, when the content is originally in the so-called left + right format, the reformatting may comprise providing the left image of the content as the 2D content 144 and omitting the right image.

[0039] Likewise, displaying the content as 3D content is to be understood as providing the content in a format that is viewable on the 3D screen 120.
For example, when the content is originally 2D content, the display processor may need to reformat the content such that the content is displayable on the 3D screen 120. Depending on the type of the 3D screen 120 and the type of the display processor, the reformatting may comprise, for example, adding a depth map to the content in order to obtain the 3D content 124 in the image + depth format. Likewise, the reformatting may comprise providing each image of the content as both a left image and a right image in order to obtain the 3D content 124 in the left + right format. In general, the format of the 3D content 124 is such that, when it is displayed on the 3D screen 120, each of the user's eyes is provided with a separate view of the content. As a result, the user typically obtains an impression of depth. This perception is known in the field of visual perception as stereopsis, with the creation of stereopsis being referred to as stereoscopy. It will be appreciated that the two separate views of the content may in fact be identical. For example, when each image of the content is provided as both a left image and a right image to obtain the 3D content 124 in the left + right format, the user may not obtain an impression of depth, even though he views the content as 3D content 124 on the 3D screen 120. Also, even when the content is originally 3D content, the content may sometimes not provide any stereoscopy by, for example, showing scenes that do not contain any depth.

[0040] It is noted that the term content refers to any type of content that can be displayed on the device 100, for example, movies, internet content, photos, or the visual output of applications, games, etc. In case the content is generated by the device 100 itself, the display processor may not need to reformat the content. For example, the content may be generated by an application or a game running on the device 100. Instead of reformatting the content provided by the application or the game, the display processor may instead instruct the application or the game to directly generate the content in the required format. For example, when the device 100 is running a game, the game may be instructed to also generate depth information alongside the image information in order to directly provide the content in an image + depth format for being displayed as 3D content 124 on the 3D screen 120.

[0041] In this respect, it is noted that the term 3D content is not to be understood as referring to, for example, when the content is the visual output of a game or application, the game or application involving a 3D environment. Such 3D environments are typically generated using computer graphics to portray the 3D environment by means of, for example, polygon meshes. While such applications and games are often referred to as 3D games and 3D applications, they do not necessarily involve stereoscopy. Instead, the term 3D content refers to the content being displayed on the 3D screen 120 so as to provide stereoscopy. Of course, it is not excluded that the 3D content 124 also portrays a 3D environment.

[0042] The display processor may be arranged to perform a 2D-to-3D conversion when the content is originally 2D content, in order to display the content as 3D content 124 on the 3D screen 120. The term 2D-to-3D conversion refers to the synthetic addition of depth or disparity to content in order to create the appearance of the content being originally 3D content. Such conversions are known from the field of video processing, for example, from the publication "Conversion from 2D to 3D: A Survey" by Qingqing Wei, Delft University of Technology, December 2005. It will be appreciated, however, that said conversion may also involve converting the content to 3D content 124 by simply setting the content to a fixed depth or disparity. In addition, the display processor may be arranged to perform a 3D-to-2D conversion when the content is originally 3D content, in order to display the content as 2D content 144 on the 2D screen 140. The term 3D-to-2D conversion refers to converting content that has disparity or depth information to content that is displayable without the disparity or depth information, for example, by removing a depth map from an image + depth format, or by calculating an intermediate image between a left image and a right image using interpolation techniques.
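The reformatting and conversion steps of paragraphs [0038] to [0042] essentially amount to attaching or discarding per-pixel depth information. The following is a minimal illustrative sketch in Python/NumPy, assuming images are plain arrays; it shows only the trivial "fixed depth" and "keep the left view" cases mentioned above, not an actual 2D-to-3D estimation algorithm, and all function names are hypothetical.

```python
import numpy as np

def to_3d_image_plus_depth(image_2d, fixed_depth=128):
    """2D -> 3D: attach a synthetic, constant depth map (the fixed-disparity case of [0042])."""
    depth = np.full(image_2d.shape[:2], fixed_depth, dtype=np.uint8)
    return image_2d, depth

def to_3d_left_plus_right(image_2d):
    """2D -> 3D: provide the same image as both the left and the right view."""
    return image_2d, image_2d.copy()

def to_2d_from_image_plus_depth(image, depth):
    """3D -> 2D: drop the depth map and keep only the image information ([0038])."""
    return image

def to_2d_from_left_plus_right(left, right):
    """3D -> 2D: keep the left view; an intermediate view could instead be
    interpolated between the two, as noted in [0042]."""
    return left

# usage: a dummy 2D frame converted for the 3D screen and back again
frame = np.zeros((480, 800, 3), dtype=np.uint8)
img, depth = to_3d_image_plus_depth(frame)
frame_again = to_2d_from_image_plus_depth(img, depth)
```

A real display processor would replace the constant depth map with an estimated one, but the data flow between the two screen formats stays the same.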
[0043] Figure 1a and Figure 1b show a user obtaining, by turning the device 100 over, both a view of the content in 3D on the 3D screen 120 and a view of the content in 2D on the 2D screen 140. For this purpose, the content may be provided as 3D content 124 on the 3D screen 120 at the same time as it is provided as 2D content 144 on the 2D screen 140, so that, irrespective of the orientation of the device 100, the content is presented to the viewer. Consequently, the content may be displayed on both screens simultaneously.

[0044] Figure 2 shows a mobile display device 200 comprising, by way of example, two orientation means, i.e., a first camera 284 and an accelerometer 280. Each of the orientation means generates orientation data 282 indicative of an orientation of the device 200. Figure 2 shows, in a cross-sectional view of the device 200, the display processor 260. The display processor 260 is arranged to display the content on either the 3D screen 120 or the 2D screen 140 in dependence on the orientation data 282. Consequently, the display processor 260 may be arranged not to display the content on the other screen, for example, by switching off the other screen, by minimizing its brightness, or by not providing the content to the other screen.

[0045] The first camera 284 constitutes a first one of the orientation means. The first camera 284 is arranged alongside the 3D screen 120 and thus faces approximately the same direction as the 3D screen 120. The first camera 284 is used in the device 200 to estimate the orientation of the device 200 with respect to the user by detecting the user's presence in a viewing window of the first camera 284. For this purpose, the first camera 284 and/or the device 200 may comprise face detection functionality for detecting whether a face is present and/or where a face is situated in the viewing window of the first camera 284. The face detection functionality may be provided by, for example, skin-tone-based face detection, Viola-Jones object detection, or any other known type of face detection. It will be appreciated that when the user is detected within the viewing window of the first camera 284, it can be concluded that the device 200 is oriented such that the 3D screen 120 is facing the user. Likewise, when the user is assumed to be operating the device 200, if the user is not detected within the viewing window of the first camera 284, it can be concluded that the device 200 is oriented such that the 3D screen 120 is facing away from the user and, as a consequence, that the oppositely arranged 2D screen 140 is facing towards the user. As such, the display processor 260 may be arranged to, in dependence on the orientation data 282, display the content on either the 3D screen 120 or the 2D screen 140 depending on whether or not the user has been detected within the viewing window of the first camera 284.
[0046] Note that the first camera 284 may also have other functions within the device 200. In particular, the first camera 284 may additionally or primarily be used to, for example, take photos, record videos, provide video chat functionality, etc. Moreover, the device 200 may comprise a first camera that does not function as an orientation means, while the device instead comprises another form of orientation means or no orientation means at all.

[0047] The cross-sectional view further shows the accelerometer 280 constituting a second one of the orientation means. The accelerometer 280 estimates the orientation of the device 200 by establishing a direction of gravity. Since the orientation of the accelerometer 280 within the device 200 is known, it can be determined how the device 200 is oriented with respect to the direction of gravity, and as such, the orientation of the device 200 with respect to the direction of gravity can be established. When assuming, for example, that the user normally views the device 200 by looking in a downward direction at the device 200, the orientation of the device 200 with respect to the direction of gravity is indicative of the orientation of the device 200 with respect to the user. Accordingly, the display processor 260 can display the content on either of the screens using the orientation of the device 200 with respect to the direction of gravity and an assumption about the user's viewing direction.

[0048] It will be appreciated that any other type of orientation means may also be used instead of or in addition to the accelerometer 280 and the first camera 284. For example, the device 200 may comprise a gyroscope as an orientation means. In particular, an orientation means that specifically establishes the orientation of the device 200 with respect to the user may be used. For example, the orientation means may comprise a touch-sensitive surface arranged along one or both screens for detecting the position of the user's hand on the device 200. The touch-sensitive surface may be arranged to, for example, sense whether the user's fingers or thumb are placed on it. Hence, when assuming that the user places his thumb on the face of the device 200 that is facing towards him and his fingers on the face of the device 200 that is facing away from him, the orientation of the device 200 with respect to the user can be established by sensing on which face the fingers and/or thumb are placed.

[0049] Figure 3 shows a user 300 holding the device 100, 200, while also, illustratively, showing a subsequent turning-over of the device 100, 200 by means of curved arrows 102 and a dashed outline. When the user initially holds the device 100, 200, the display processor may display the content on the upward facing screen by using, for example, an accelerometer to determine which screen is facing upwards, or by using a camera to determine which of the faces is facing the user, that is, the upward facing screen in the situation shown in Figure 3. In this respect, it is noted that the upward facing face is understood to be the one facing more upwards than downwards, that is, facing in an upward direction, and the downward facing face is understood to be the one facing the ground, or the direction of gravity.
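Paragraphs [0044] to [0049] together describe a simple decision rule: prefer a camera-based cue (is a face visible on the 3D-screen side?), otherwise fall back to the accelerometer, and then enable only the screen judged to be facing the user. A rough sketch of that rule is given below, using placeholder data structures rather than any real device API; the field names and screen records are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class OrientationData:
    face_seen_by_3d_side_camera: Optional[bool]  # None when face detection gives no answer
    gravity_along_3d_normal: Optional[float]     # dot(3D-screen normal, gravity); None if unavailable

def select_active_screen(o: OrientationData) -> str:
    """Return '3d' or '2d' for the screen judged to be facing the user."""
    if o.face_seen_by_3d_side_camera is not None:
        # Camera cue ([0045]): a detected face means the 3D screen faces the user.
        return "3d" if o.face_seen_by_3d_side_camera else "2d"
    if o.gravity_along_3d_normal is not None:
        # Accelerometer cue ([0047]), assuming the user looks down at the device:
        # the screen whose outward normal points away from gravity is facing up.
        return "3d" if o.gravity_along_3d_normal < 0 else "2d"
    return "3d"  # no orientation means available: fall back to a default

def update_screens(o: OrientationData, screens: Dict[str, dict]) -> None:
    """Show the content on the selected screen and switch the other one off ([0044])."""
    active = select_active_screen(o)
    for name, screen in screens.items():
        screen["enabled"] = (name == active)

# usage with dummy screen records
screens = {"3d": {"enabled": False}, "2d": {"enabled": False}}
update_screens(OrientationData(True, None), screens)
assert screens["3d"]["enabled"] and not screens["2d"]["enabled"]
```

Re-running the rule whenever the orientation data changes gives the switching behaviour of [0014] and also reduces the power drawn by the inactive screen, as noted in [0013].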
[0050] When the user subsequently turns the device 100, 200 over, for example, to switch between viewing the content in 2D and viewing it in 3D, the display processor may again determine which of the screens is facing upwards and/or facing the user using the orientation means, and then display the content on it. The display processor may be configured to always display the content on the screen that is facing upwards and/or facing the user. As a result, the display processor may switch between displaying the content on the 3D screen and displaying the content on the 2D screen when or after the user turns the device 100, 200 over. Of course, the display processor may also be configured to always display the content on the screen that is facing downwards, as discussed with reference to Figure 4b.

[0051] In addition, or alternatively, the orientation means may determine a change in the orientation of the device 100, 200, and the display processor may switch between displaying the content on the 3D screen and displaying the content on the 2D screen based on the change in orientation. Thus, in addition to or as an alternative to using the orientation of the device 100, 200 to switch between displaying the content on the 3D screen and displaying the content on the 2D screen, said change in orientation may be used.

[0052] Figure 4a shows the user 300 operating the device 100, 200 while standing or sitting, while also, illustratively, showing the normal vector 302 of the display screen surface that is facing the user 300, the normal vector 304 of the display screen surface that is facing away from the user 300, as well as the direction of gravity 306. The device 100, 200 may be arranged to always display the content on the screen that is facing upwards. For this purpose, an accelerometer may be used to determine which of the normal vectors 302, 304 is more aligned with the direction of gravity 306 and which of the normal vectors 302, 304 is less aligned with the direction of gravity 306. This allows it to be established which of the screens is facing upwards and which of the screens is facing downwards. The user 300 can be assumed to be viewing the device 200 from above, as is normally the case when standing or sitting. The display processor may therefore be arranged to always display the content on the screen that is facing upwards.

[0053] Figure 4b shows a user 310 operating the device 100, 200 while lying down, for example, on a sofa. In this situation, the user 310 can be considered to be viewing the device 100, 200 from below, and the display processor may be arranged to always display the content on the screen that is facing downwards. To toggle between the above two display processor settings, that is, displaying the content on the screen that is facing upwards or displaying the content on the screen that is facing downwards, the device 100, 200 may comprise a user input to allow the user 310 to select between the two settings, that is, to configure the display processor to display the content on either the upward facing or the downward facing screen. The user input may take the form of a physical switch or button arranged on the device 100, 200, a user interface button that is displayed on one or both screens and is touch-operable, a voice-operable user input, or any other appropriate form.
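The face-up test of paragraph [0052] amounts to comparing each screen normal with the gravity vector, and paragraph [0053] adds a user preference for the face-up or face-down screen. A small sketch of that computation follows, assuming the normals and the gravity direction are given as vectors in the device frame; it is illustrative only, not an implementation of the patent.

```python
import numpy as np

def facing_up_screen(normal_3d, normal_2d, gravity):
    """Return '3d' or '2d' for the screen whose outward normal is least aligned
    with gravity, i.e. the screen that is facing upwards ([0052])."""
    g = np.asarray(gravity, dtype=float)
    g = g / np.linalg.norm(g)
    align_3d = float(np.dot(np.asarray(normal_3d, dtype=float), g))
    align_2d = float(np.dot(np.asarray(normal_2d, dtype=float), g))
    return "3d" if align_3d < align_2d else "2d"

def screen_to_use(normal_3d, normal_2d, gravity, prefer_face_up=True):
    """Apply the user preference of [0053]: content on the face-up screen by default,
    or on the face-down screen when the user is, e.g., lying on a sofa."""
    up = facing_up_screen(normal_3d, normal_2d, gravity)
    if prefer_face_up:
        return up
    return "2d" if up == "3d" else "3d"

# usage: device lying flat with the 3D screen pointing upwards, gravity pointing down
print(screen_to_use(normal_3d=[0, 0, 1], normal_2d=[0, 0, -1], gravity=[0, 0, -1]))  # -> '3d'
```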
[0054] It will be appreciated that when the device comprises a camera as an orientation means, the display processor may always display the content on the screen that is facing the user 310, without the need to configure the display processor between the two aforementioned settings and without the need to make an assumption about the viewing direction of the user 310. Thus, an indication to the device 100, 200 of whether the user 310 is standing, sitting or lying down may not be necessary. The device 100, 200 may nevertheless comprise a user input to allow the user 310 to manually select the screen on which the content is to be displayed. As such, the user input may function as an override, for example, when it cannot be reliably determined which screen is facing the user 310. The latter may be due to, for example, insufficient lighting conditions. The device 100, 200 may also comprise a user input to allow the user 310 to manually select either of the screens when the device 100, 200 does not comprise an orientation means. Alternatively, when the device 100, 200 does not comprise an orientation means, the display processor may be configured to display the content on both screens simultaneously.

[0055] Figure 5a shows the device 100 of Figure 1a and Figure 1b, in which the display processor is arranged to additionally display an indicator 146 on the 2D screen 140 when displaying originally 3D content on the 2D screen 140, in order to prompt the user to turn the device 100 over to view the 3D content on the 3D screen 120. The indicator 146 may be a graphic, for example, an icon, or text. The display processor may determine that the content is originally 3D content by, for example, determining whether a reformatting of the content was required in order to display the content on the 2D screen 140. Figure 5b shows the result of the user turning the device 100 over, whereby the content is displayed as 3D content 124 on the 3D screen 120, that is, in its original format.

[0056] Alternatively or additionally, the display processor may be arranged to display the indicator 146 on the 3D screen 120 when displaying originally 2D content thereon, in order to prompt the user to turn the device 100 over so as to obtain the view of the 2D content on the 2D screen 140. Both indicators may be the same or similar, for example, merely indicating to turn the device 100 over, or they may be different in order to indicate whether the content is originally 3D content or originally 2D content, or to indicate on which screen it should be viewed.
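The indicator logic of paragraphs [0055] and [0056] reduces to comparing the native format of the content with the screen it is currently shown on. A minimal sketch, with hypothetical format labels and prompt text:

```python
def indicator_text(native_format: str, active_screen: str):
    """Return a prompt when originally 3D content is shown on the 2D screen or
    originally 2D content is shown on the 3D screen ([0055], [0056]); None otherwise."""
    if native_format == "3d" and active_screen == "2d":
        return "Turn the device over to view this content in 3D"
    if native_format == "2d" and active_screen == "3d":
        return "Turn the device over to view this content in 2D"
    return None

# usage: a natively 3D movie currently shown on the 2D screen triggers the indicator
assert indicator_text("3d", "2d") is not None
assert indicator_text("3d", "3d") is None
```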
[0057] Figure 6a shows the device 200 of Figure 2 running a video chat application. Here, the user is facing the 3D screen 120, which is apparent from the video chat application showing the viewing window of the first camera 284 in a window 288, with the user being visible in that viewing window. Window 288 thus shows what the first camera 284 is recording and transmitting to the other party. The user may decide to turn the device 200 over, for example, to use the video chat application on the 2D screen 140. Figure 6b shows the result of the user turning the device 200 over. Here it is visible that the device 200 comprises a second camera 286 arranged alongside the 2D screen 140. Furthermore, the display processor is arranged to, in dependence on the orientation data, switch from using the first camera 284 as the video chat recording device to using the second camera 286 as the video chat recording device. As a result, after the device 200 has been turned over, the second camera 286 is now used to record the user and transmit the result to the other party. This is apparent from the viewing window shown in window 288 being substantially identical to that of Figure 6a, i.e., again showing the user.

[0058] It will be appreciated that, in general, the first camera may have a first camera function in the device, the second camera may have a second camera function in the device, and the display processor may be arranged to, in dependence on the orientation data, swap the first camera function and the second camera function in the device when switching between displaying the content on the 3D screen and on the 2D screen. In addition, or alternatively, the display processor may, after having determined which screen is facing the user, assign a first camera function to the camera arranged alongside that screen and assign a second camera function to the other camera arranged alongside the other screen. The first camera function may comprise, for example, shooting self-portrait photos or recording video for use in a video chat application, whereas the second camera function may comprise, for example, a photo or video camera functionality.

[0059] Furthermore, the first camera and the second camera may together constitute an orientation means. For this purpose, the device may comprise face detection functionality for detecting whether a face is present in the viewing window of the first camera and/or whether a face is present in the viewing window of the second camera. Consequently, it may not be necessary to assume that, when the user is not present in the viewing window of the first camera arranged alongside the 3D screen, the user is facing the 2D screen, as this can be determined directly by detecting whether the user is present in the viewing window of the second camera.

[0060] The device may comprise a first touch sensor arranged in or alongside the 3D screen and a second touch sensor arranged in or alongside the 2D screen. In particular, the 3D screen may comprise a first touch-sensitive layer to allow the user to interact with the content displayed on the 3D screen. As such, the 3D screen acts as a user input device. In addition, the 2D screen may comprise a second touch-sensitive layer to allow the user to interact with the content displayed on the 2D screen. Consequently, each of the 3D screen and the 2D screen, as shown in the Figures, may comprise a touch-sensitive layer, and as a result, a user input device may be provided on both faces of the device. The first touch sensor may have a first touch function in the device, the second touch sensor may have a second touch function in the device, and the display processor may be arranged to, in dependence on the orientation data, swap the first touch function and the second touch function in the device. In particular, the first touch function may relate to allowing a touch-based interaction between the user and the device, and the second touch function may relate to preventing any touch-based interaction, i.e., the second touch function may correspond to effectively disabling the second touch sensor for touch-based interaction. As such, the display processor may swap the first touch function and the second touch function in order to allow touch-based interaction on the face of the device that is facing the user, and to prevent touch-based interaction on the opposite face of the device, that is, the one facing away from the user.
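The swapping of camera functions ([0058]) and touch functions ([0060]) can be thought of as a single role assignment that is re-evaluated whenever the user-facing side changes. A sketch of that idea follows; the role keys and device names are purely illustrative and do not correspond to any real platform API.

```python
def assign_roles(user_facing_side: str) -> dict:
    """Assign the camera and touch functions of [0058] and [0060] according to
    which face of the device is towards the user ('3d' or '2d')."""
    other_side = "2d" if user_facing_side == "3d" else "3d"
    return {
        # camera next to the screen the user is viewing: self-portrait / video chat recording
        "video_chat_camera": f"camera_{user_facing_side}",
        # camera on the opposite face: ordinary photo / video camera
        "photo_camera": f"camera_{other_side}",
        # touch sensor towards the user handles interaction; the other one is ignored
        "active_touch_sensor": f"touch_{user_facing_side}",
        "ignored_touch_sensor": f"touch_{other_side}",
    }

# usage: the user turns the device over from the 3D face to the 2D face
before = assign_roles("3d")
after = assign_roles("2d")
assert before["video_chat_camera"] == after["photo_camera"]  # the roles are swapped
```

Re-evaluating the assignment after a flip effectively swaps the two camera functions and the two touch functions, which is the behaviour the description requires.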
[0061] The 3D screen may be an autostereoscopic multi-view 3D screen. Such screens are known per se from the technical field of 3D displays and are particularly suitable for use in mobile computing because, owing to the autostereoscopic aspect, the user does not need to wear glasses or similar devices to obtain a 3D view and, owing to the multi-view aspect, a wide range of viewing positions suitable for 3D viewing is provided. As an alternative, autostereoscopic stereo 3D screens may be used, which emit two views to allow glasses-free stereoscopy, albeit in a smaller range of viewing positions.

[0062] Alternatively, non-autostereoscopic screens may be used, thus requiring the user to, for example, wear shutter-based or polarizer-based glasses to obtain 3D viewing. In this case, the orientation means may be arranged to determine the orientation of the device with respect to the user by determining the orientation of the device with respect to the glasses, that is, the location and/or presence of the glasses is taken as an indication of the location and/or presence of the user. For example, when the orientation means is constituted by a camera, the camera may be arranged to detect a location and/or presence of the glasses in the viewing window so as to estimate the orientation of the device with respect to the user. Furthermore, the glasses may be arranged to facilitate detection by the orientation means.

[0063] The 3D screen and the 2D screen may be of substantially equal size. In particular, both screens may have the same light-modulating part and thus differ mainly in that the 3D screen additionally comprises a view-forming part, for example, a lenticular lens array.

[0064] The device as described above may be a tablet device, that is, be able to run applications and comprise a touch screen with a screen diagonal of approximately 5" or larger. The device may also be a smartphone, that is, be able to run applications, provide telephone functionality, and comprise a touch screen with a screen diagonal between approximately 3" and 5". The device may also be a digital photo frame, a portable optical disc player, a digital media player, or any similar mobile display device.

[0065] It should be noted that the aforementioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The use of the verb "to comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Claims (14)

[0001] 1. Mobile display device (100, 200) for enabling a user (300, 310) to obtain both a three-dimensional [3D] and a two-dimensional [2D] view of content, characterized in that it comprises: - a 3D screen (120) to enable viewing of the content in 3D; - a 2D screen (140) to enable viewing of the content in 2D; - a display processor (260) for displaying the content as 3D content (124) on the 3D screen and for displaying the content as 2D content (144) on the 2D screen; - the 3D screen and the 2D screen being arranged on opposite faces (122, 142) of the device to enable the user to obtain either the 3D view or the 2D view of the content by turning (102) the device over; wherein the display processor (260) is arranged to: - display an indicator (146) on the 2D screen (140) when displaying originally 3D content on the 2D screen, to prompt the user to turn the device over to obtain the 3D view of the content on the 3D screen (120); or - display the indicator on the 3D screen when displaying originally 2D content on the 3D screen, to prompt the user to turn the device over to obtain the 2D view of the content on the 2D screen.

[0002] 2. Mobile display device (200) according to claim 1, characterized in that it further comprises an orientation means (280, 284) for generating orientation data (282) indicative of an orientation (302, 304) of the device, the display processor (260) being arranged to, in dependence on the orientation data, display the content on either the 3D screen (120) or the 2D screen (140).

[0003] 3. Mobile display device (200) according to claim 2, characterized in that the display processor (260) is arranged to, in dependence on a change in the orientation of the device, switch between displaying the content on the 3D screen (120) and displaying the content on the 2D screen (140).

[0004] 4. Mobile display device (200) according to claim 2, characterized in that the orientation means (280, 284) comprises a first camera (284) for estimating the orientation of the device with respect to the user (300, 310) by detecting a presence of the user in a viewing window of the first camera.

[0005] 5. Mobile display device (200) according to claim 2, characterized in that the orientation means (280, 284) comprises an accelerometer (280) for establishing a direction of gravity (306), and the display processor (260) is arranged to display the content on an upward or downward facing one of the 3D screen (120) and the 2D screen (140) by comparing the orientation (302, 304) of the device with the direction of gravity.

[0006] 6. Mobile display device (200) according to claim 5, further comprising a user input to allow the user (300, 310) to configure the display processor (260) to display the content on either the upward facing or the downward facing one of the 3D screen (120) and the 2D screen (140).

[0007] 7. Mobile display device (200) according to claim 1, characterized in that it further comprises a first camera (284) arranged alongside the 3D screen (120) and a second camera (286) arranged alongside the 2D screen (140), the first camera having a first camera function in the device, the second camera having a second camera function in the device, and the display processor (260) being arranged to, in dependence on the orientation data (282), swap the first camera function and the second camera function in the device.
[0008] 8. Mobile display device (200) according to claim 1, characterized in that the display processor (260) is arranged to: - perform a 2D-to-3D conversion when the content is originally 2D content, for displaying said content as 3D content (124) on the 3D screen (120); or - perform a 3D-to-2D conversion when the content is originally 3D content, for displaying said content as 2D content (144) on the 2D screen (140).

[0009] 9. Mobile display device (200) according to claim 1, characterized in that it further comprises a first touch sensor arranged in or alongside the 3D screen (120) and a second touch sensor arranged in or alongside the 2D screen (140).

[0010] 10. Mobile display device (200) according to claim 9, characterized in that the first touch sensor has a first touch function in the device, the second touch sensor has a second touch function in the device, and the display processor (260) is arranged to, in dependence on the orientation data (282), swap the first touch function and the second touch function in the device.

[0011] 11. Mobile display device (200) according to claim 1, characterized in that the 3D screen (120) is an autostereoscopic multi-view 3D screen.

[0012] 12. Tablet device, smartphone or digital photo frame, characterized in that it comprises the mobile display device (100, 200) as defined in claim 1.

[0013] 13. Method of enabling a user to obtain both a three-dimensional [3D] and a two-dimensional [2D] view of content using a mobile display device, the mobile display device characterized in that it comprises: - a 3D screen to enable viewing of the content in 3D; - a 2D screen to enable viewing of the content in 2D; and - the 3D screen and the 2D screen being arranged on opposite faces of the device; and the method comprising: - generating orientation data indicative of an orientation of the device; - displaying, in dependence on the orientation data, the content either as 3D content on the 3D screen or as 2D content on the 2D screen, to enable the user to obtain either the 3D view or the 2D view of the content by turning the device over; and - displaying an indicator on the 2D screen when displaying originally 3D content on the 2D screen, to prompt the user to turn the device over to obtain the 3D view of the content on the 3D screen, or displaying an indicator on the 3D screen when displaying originally 2D content on the 3D screen, to prompt the user to turn the device over to obtain the 2D view of the content on the 2D screen.

[0014] 14. A computer-readable medium characterized in that it comprises instructions for causing a processor system to perform the method as defined in claim 13.
Legal status:
2018-03-27 | B15K | Others concerning applications: alteration of classification | IPC: G06F 1/16 (2006.01), H04N 13/00 (2018.01)
2018-12-04 | B06F | Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]
2019-12-10 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2021-06-01 | B350 | Update of information on the portal [chapter 15.35 patent gazette]
2021-07-06 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2021-09-08 | B16A | Patent or certificate of addition of invention granted [chapter 16.1 patent gazette] | Free format text: Term of validity: 20 (twenty) years counted from 21/12/2012, subject to the legal conditions.
Priority:
Application number | Filing date | Patent title
EP12150796.6 | 2012-01-11 |
EP12150796 | 2012-01-11 |
PCT/EP2012/076754 | 2012-01-11 | 2012-12-21 | Mobile display device