Patent Abstract:
HEAD-MOUNTED DISPLAY DEVICE EMPLOYING ONE OR MORE REFLECTIVE OPTICAL SURFACES. Head-mounted displays (100) are described which include a frame (107), an image display system (110) supported by the frame (107), and a reflective surface, for example, a free-space, ultra-wide angle reflective optical surface (FS/UWA/RO surface) (120), supported by the frame (107). In certain embodiments, the reflective surface (120) produces spatially separated virtual images that are angularly separated by at least 100, 150, or 200 degrees. Also described are methods and apparatus for designing reflective optical surfaces, including FS/UWA/RO surfaces, for use in head-mounted displays (100).
Publication number: BR112013009826B1
Application number: R112013009826-0
Filing date: 2011-12-20
Publication date: 2021-01-26
Inventors: Gregory A. Harrison; David Alan Smith; Gary E. Wiese
Applicant: Lockheed Martin Corporation
IPC main class:
Patent Description:

[0001] This application claims priority to U.S. Application No. 13/211,372, filed on August 17, 2011, to U.S. Provisional Application No. 61/405,440 (entitled HEAD-MOUNTED DISPLAY, filed on October 21, 2010), to U.S. Provisional Application No. 61/417,325 (entitled CURVED STACKED FRESNEL ARCHITECTURE, filed on November 26, 2010), to U.S. Provisional Application No. 61/417,326 (entitled CURVED BEAM SPLITTER ARCHITECTURE, filed on November 26, 2010), to U.S. Provisional Application No. 61/417,327 (entitled COMBINED FRESNEL LENS AND FLAT BEAM SPLITTER ARCHITECTURE, filed on November 26, 2010), to U.S. Provisional Application No. 61/417,328 (entitled COMBINED FRESNEL LENS AND CURVED BEAM SPLITTER ARCHITECTURE, filed on November 26, 2010), and to U.S. Provisional Application No. 61/427,530 (entitled CURVED MIRROR FOR HEAD-MOUNTED DISPLAY, filed on December 28, 2010), all of which are hereby incorporated by reference in their entirety.

Field
[0002] This description relates to a head-mounted display device that employs one or more reflective optical surfaces, for example, one or more free-space, ultra-wide angle reflective optical surfaces (hereinafter abbreviated as "FS/UWA/RO surfaces"). More particularly, the description relates to a head-mounted display device in which reflective optical surfaces, such as FS/UWA/RO surfaces, are used to display images from a light-emitting image display system held in close proximity to a user's eye.

Background
[0003] A head-mounted display, such as a helmet-mounted display or an eyeglass-mounted display (abbreviated here as an "HMD"), is a display device worn on an individual's head which features one or more small display devices located near one eye or, more commonly, both eyes of the user. Figure 1 shows the basic elements of one type of HMD, which include a display 11, a reflective optical surface 13, and an eye 15 having a center of rotation 17. As shown in this figure, light 19 from the display 11 is reflected by the surface 13 and enters the user's eye 15.
[0004] Some HMDs display only simulated (computer-generated) images, as opposed to real-world images, and consequently are generally referred to as "virtual reality" or immersive HMDs. Other HMDs superimpose (combine) a simulated image on a non-simulated, real-world image. The combination of non-simulated and simulated images allows the HMD user to view the world through, for example, a visor or eyepiece on which additional data relevant to the task to be performed is superimposed onto the forward field of view (FOV) of the user. This superposition is sometimes referred to as "augmented reality" or "mixed reality".
[0005] Combining a non-simulated, real-world view with a simulated image can be achieved using a partially reflective / partially transmissive optical surface (a "beam splitter") where the surface's reflectivity is used to display the simulated image as a virtual image (in the optical sense) and the surface's transmissivity is used to allow the user to view the real world directly (referred to as an "optical see-through system"). Combining a real-world view with a simulated image can also be done electronically by accepting video of a real-world view from a camera and mixing it electronically with a simulated image using a combiner (referred to as a "video see-through system"). The combined image can then be presented to the user as a virtual image (in the optical sense) by means of a reflective optical surface, which in this case need not have transmissive properties.
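For the "video see-through" case just described, the electronic merging step can be illustrated with a simple per-pixel blend. This is an illustrative sketch only; the function names, pixel format, and `alpha` weight are assumptions, not details taken from the patent.

```python
# Illustrative sketch (not the patent's method): electronically merging
# a real-world camera frame with a simulated image, as in a
# "video see-through" system.

def merge_frames(camera_px, simulated_px, alpha=0.5):
    """Blend one RGB camera pixel with one simulated RGB pixel.

    alpha is the weight of the simulated image
    (0 = camera only, 1 = simulated only).
    """
    return tuple(
        round(alpha * s + (1.0 - alpha) * c)
        for c, s in zip(camera_px, simulated_px)
    )

def merge_images(camera, simulated, alpha=0.5):
    """Blend two images, each given as a list of RGB pixel tuples."""
    return [merge_frames(c, s, alpha) for c, s in zip(camera, simulated)]
```

In a real combiner the blend would typically be done per region (for example, overlaying symbology only where it is drawn) rather than with a single global weight.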
[0006] From the foregoing, it can be seen that reflective optical surfaces can be used in HMDs that provide the user with: (i) a combination of a simulated image and a non-simulated, real-world image, (ii) a combination of a simulated image and a video image of the real world, or (iii) purely simulated images. (The last case is generally referred to as an "immersive" system.) In each of these cases, the reflective optical surface produces a virtual image (in the optical sense) that is viewed by the user. Historically, such reflective optical surfaces have been part of optical systems whose exit pupils substantially limited not only the dynamic field of view available to the user, but also the static field of view. Specifically, to see the image produced by the optical system, the user needed to align his eye with the exit pupil of the optical system and keep it so aligned, and even then the image visible to the user would not fill the eye's entire static field of view. That is, the previous optical systems used in HMDs that employed reflective optical surfaces were pupil-forming systems and, therefore, were exit-pupil limited.
[0007] The reason the systems were so limited is the fundamental fact that the human field of view is remarkably large. Thus, the static field of view of a human eye, including both the eye's foveal and peripheral vision, is on the order of ~150° in the horizontal direction and on the order of ~130° in the vertical direction. (For the purposes of this description, 150 degrees will be used as the static forward field of view of a nominal human eye.) Well-corrected optical systems having exit pupils capable of accommodating such a large static field of view are few and far between, and, when they do exist, they are expensive and bulky.
[0008] Furthermore, the operational field of view of the human eye (its dynamic field of view) is even larger, since the eye can rotate about its center of rotation, that is, the human brain can aim the eye's foveal + peripheral field of view in different directions by changing the direction of the eye's gaze. For a nominal eye, the vertical range of motion is on the order of ~40° up and ~60° down, and the horizontal range of motion is on the order of ±~50° from straight ahead. For an exit pupil of the size produced by the types of optical systems previously used in HMDs, even a small rotation of the eye would substantially reduce the overlap between the eye's static field of view and the exit pupil, and larger rotations would cause the image to disappear completely. Although theoretically possible, an exit pupil that moved in synchrony with the user's eye would be impractical and prohibitively expensive.
[0009] In view of these properties of the human eye, there are three fields of view that are relevant in terms of providing an optical system that allows a user to view an image generated by an image display system in the same way he would view the natural world. The smallest of the three fields of view is that defined by the user's ability to rotate his eye and thus sweep his fovea over the outside world. The maximum rotation is on the order of ±50° from straight ahead, so this field of view (the dynamic foveal field of view) is approximately 100°. The middle of the three fields of view is the static forward field of view, which includes both the user's foveal and peripheral vision. As discussed above, this field of view (the static foveal + peripheral field of view) is on the order of 150°. The largest of the three fields of view is that defined by the user's ability to rotate his eye and thus sweep his foveal + peripheral vision over the outside world. Based on a maximum rotation on the order of ±50° and a static foveal + peripheral field of view on the order of 150°, this largest field of view (the dynamic foveal + peripheral field of view) is on the order of 200°. This increasing series of fields of view, from at least 100 degrees to at least 150 degrees and then to at least 200 degrees, provides corresponding benefits to the user in terms of his ability to view images generated by an image display system in an intuitive and natural manner.
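The three nominal fields of view just discussed can be summarized in a short sketch. The angle values are the nominal ones stated in the text; the 200° figure is taken as given in the description rather than derived.

```python
# Nominal field-of-view figures for the nominal human eye, in degrees.
MAX_EYE_ROTATION = 50  # maximum horizontal eye rotation, +/- from straight ahead

# Smallest: dynamic foveal field of view -- the fovea swept through the
# eye's full horizontal rotation range.
dynamic_foveal_fov = 2 * MAX_EYE_ROTATION  # ~100 degrees

# Middle: static foveal + peripheral field of view (given as ~150 degrees).
static_fov = 150

# Largest: dynamic foveal + peripheral field of view, stated in the
# description to be on the order of 200 degrees.
dynamic_full_fov = 200

# The three thresholds form an increasing series: 100 < 150 < 200.
assert dynamic_foveal_fov < static_fov < dynamic_full_fov
```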
[0010] There is, therefore, a need for head-mounted displays that have improved compatibility with the field of view, both static and dynamic, of the human eye. The present description addresses this need and provides head-mounted displays that employ reflective optical surfaces which provide an ultra-wide viewing angle.

Definitions
[0011] In the remainder of this description and in the claims, the phrase "virtual image" is used in its optical sense, that is, a virtual image is an image that is perceived as coming from a particular place from which, in fact, the perceived light does not originate.
[0012] An FS/UWA/RO surface is referred to here as a "free space" surface because its local spatial positions, local surface curvatures, and local surface orientations are not tied to a particular substrate, such as the x-y plane, but instead, during the surface's design, are determined using fundamental optical principles (for example, the Fermat and Hero least-time principle) applied in three-dimensional space.
[0013] The FS/UWA/RO surface is referred to as an "ultra-wide angle" surface because, during use, at a minimum, it does not limit the dynamic foveal field of view of a nominal user's eye. Thus, depending on the optical properties of the optional optical components used with the "ultra-wide angle" surface, for example, a Fresnel lens system, the overall HMD optical system can be non-pupil-forming. That is, unlike conventional optical systems, which have an exit pupil that limits the user's field of view, the operative pupil for various embodiments of the optical systems described here is the entrance pupil of the user's eye, as opposed to a pupil associated with the external optical system. Concomitantly, for these embodiments, the field of view provided to the user will be much larger than with conventional optical systems, where even a small misalignment of the user's eye with the exit pupil of the external optical system can substantially reduce the information content available to the user, and larger misalignments can cause the entire image to disappear.
[0014] Throughout this description, the following phrases / terms will have the following meanings / scope:

(1) The phrase "reflective optical surface" (also referred to here as a "reflective surface") will include surfaces that are only reflective as well as surfaces that are both reflective and transmissive. In each case, the reflectivity may be only partial, that is, part of the incident light may be transmitted through the surface. Likewise, when the surface is both reflective and transmissive, the reflectivity and/or the transmissivity may be partial. As discussed below, a single reflective optical surface may be used for both eyes, or each eye may have its own individual reflective optical surface. Other variations include the use of multiple reflective optical surfaces for both eyes or individually for each eye. Mix-and-match combinations may also be used, for example, a single reflective optical surface for one eye and multiple reflective optical surfaces for the other eye. As an additional alternative, one or multiple reflective optical surfaces may be provided for only one of the user's eyes. The claims presented below are intended to cover these and other applications of the reflective optical surfaces described here. In particular, each claim requiring a reflective optical surface is intended to cover a head-mounted display device that includes one or more reflective optical surfaces of the specified type.

(2) The phrase "an image display system having at least one light-emitting surface" is used generally to include any display system having a surface that emits light, whether by transmission of light through the surface, by generation of light at the surface (for example, by an array of LEDs), by reflection off the surface of light from another source, or the like. The image display system may employ one or multiple image display devices, for example, one or multiple LED and/or LCD arrays. As with reflective optical surfaces, a particular head-mounted display device may incorporate one or more image display systems for one or both of the user's eyes. Again, each of the claims presented below that requires an image display system is intended to cover a head-mounted display device that includes one or more image display systems of the specified type.

(3) The phrase "binocular display" indicates an apparatus that includes at least one separate optical element (for example, a display device and/or a reflective optical surface) for each eye.

(4) The phrase "field of view" and its abbreviation FOV refer to the "apparent" field of view in image (eye) space, as opposed to the "real" field of view in object (i.e., display) space.
[0015] According to a first aspect, a head-mounted display device (100) is described which includes: (I) a frame (107) adapted to be mounted on a user's head (105); (II) an image display system (110) supported by the frame (107) (for example, the frame supports the image display system in a fixed location which, during use of the HMD, is outside the user's field of view); and (III) a reflective optical surface (120) supported by the frame (107), the reflective optical surface (120) being a continuous surface that is not rotationally symmetric about any coordinate axis of a three-dimensional Cartesian coordinate system (for example, the reflective optical surface can be a free-space, ultra-wide angle reflective optical surface (120) that is not rotationally symmetric (not a surface of revolution) about the x, y, or z axis of a three-dimensional Cartesian coordinate system having an arbitrary origin);
[0016] where (a) the image display system (110) includes at least one light-emitting surface (81); (b) during use, the reflective optical surface (120) produces spatially separated virtual images of spatially separated portions of the at least one light-emitting surface (81), at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees, the angular separation being measured from a center of rotation (17) of a nominal user's eye (15); and (c) during use, at least one point on the reflective optical surface (120) is angularly separated from at least one other point on the reflective optical surface (120) by at least 100 degrees, said angular separation being measured from the center of rotation (17) of a user's eye (15).
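The "at least 100 degrees" conditions in (b) and (c) above can be checked numerically as the angle subtended at the eye's center of rotation by two points (two virtual image locations, or two points on the reflective surface). The sketch below uses hypothetical coordinates in arbitrary units; it is an illustration of the geometry, not code from the patent.

```python
import math

def angular_separation_deg(center, p1, p2):
    """Angle, in degrees, between points p1 and p2 as seen from center.

    center, p1, p2 are 3D points given as (x, y, z) tuples; the angle is
    that between the vectors center->p1 and center->p2.
    """
    v1 = tuple(a - c for a, c in zip(p1, center))
    v2 = tuple(a - c for a, c in zip(p2, center))
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))
```

For example, with the center of rotation at the origin, points at (1, 0, 1) and (-1, 0, 1) subtend 90 degrees, while (1, 0, 1) and (-1, 0, 0.2) subtend roughly 124 degrees, satisfying the 100-degree condition.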
[0017] According to a second aspect, a head-mounted display device (100) is described which includes: (I) a frame (107) adapted to be mounted on a user's head (105); (II) an image display system (110) supported by the frame (107) (for example, the frame supports the image display system in a fixed location which, during use of the HMD, is outside the user's field of view); and (III) a free-space, ultra-wide angle reflective optical surface (120) supported by the frame (107);
[0018] where (a) the image display system (110) includes at least one light-emitting surface (81); and (b) during use, the free-space, ultra-wide angle reflective optical surface (120) produces spatially separated virtual images of spatially separated portions of the at least one light-emitting surface (81), at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees, the angular separation being measured from a center of rotation (17) of a nominal user's eye (15).
[0019] According to a third aspect, a head-mounted display device (100) is described which includes: (I) a frame (107) adapted to be mounted on a user's head (105); (II) an image display system (110) supported by the frame (107); and (III) a reflective surface (120) supported by the frame (107), the reflective surface (120) providing a field of view to a nominal user of at least 200°;
[0020] where: (a) the image display system (110) includes at least one light-emitting surface (81) which includes at least first and second spatially separated light-emitting regions (82, 83) containing, respectively, first and second information content; (b) the reflective surface (120) comprises at least first and second spatially separated reflective regions (84, 86) having, respectively, first and second surface normals (85, 87) that point in different directions; and (c) the frame (107) supports the image display system (110) and the reflective surface (120) so that, during use of the device by a nominal user: (i) for at least one gaze direction (direction 88 in Figure 8) of an eye (71) of the nominal user, light from the first light-emitting region (82) reflects off the first reflective region (84) and enters said eye (71) to form a visible virtual image (88) of the first information content (that is, there is a gaze direction in which a nominal user can see the first information content (and, optionally, the second information content)); (ii) for at least one gaze direction (direction 89 in Figure 8) of said eye (71), light from the second light-emitting region (83) reflects off the second reflective region (86) and enters said eye (71) to form a visible virtual image (89) of the second information content (that is, there is a gaze direction in which the nominal user can see the second information content (and, optionally, the first information content)); and (iii) for at least one gaze direction (a gaze direction to the right of 88 in Figure 8) of said eye (71), light from the first light-emitting region (82) reflects off the first reflective region (84) and enters said eye (71) to form a visible virtual image (88) of the first information content, while light from the second light-emitting region (83) reflects off the second reflective region (86) but does not enter said eye (71) and does not form a visible virtual image of the second information content (that is, there is a gaze direction in which the nominal user can see the first information content but cannot see the second information content with either the nominal user's foveal or peripheral vision).
[0021] According to a fourth aspect, a computer-based method is described for designing a reflective optical surface (120), which may or may not be an FS/UWA/RO surface, for use in a head-mounted display (100) that includes an image display system (110) which, during use of the head-mounted display (100), presents a plurality of content areas (82, 83) (for example, a plurality of individual pixels or a plurality of groupings of individual pixels), the method including using one or more computers to perform the steps of: (a) dividing the reflective optical surface (120) into a plurality of local reflective regions (84, 86), each local reflective region having a surface normal (85, 87) (for example, a surface normal at the center of the local reflective region); (b) associating each local reflective region (84, 86) of the reflective optical surface (120) with one and only one content area (82, 83) of the image display system (110), each content area (82, 83) being associated with at least one local reflective region (84, 86); and (c) adjusting the configuration of the reflective optical surface (120) (for example, adjusting the surface's local spatial location and/or local curvature) so that each of the surface normals (85, 87) bisects the angle between the following two vectors: (1) a vector (77, 78) from the local reflective region (84, 86) (for example, from the center of the local reflective region) to its associated content area (82, 83) (for example, to the center of its associated content area); and (2) a vector (79, 80) from the local reflective region (84, 86) (for example, from the center of the local reflective region) to the location of a center of rotation (72) of the eye (71) of a nominal user during use of the head-mounted display device (100).
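Step (c) of this design method can be illustrated with a minimal sketch (hypothetical geometry and function names; not the patent's actual implementation): for each local reflective region, the surface normal is taken as the normalized bisector of the unit vector toward the associated content area and the unit vector toward the eye's center of rotation, so that, per the law of reflection, light from the content area reflects toward the eye.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(a * a for a in v))
    return tuple(a / n for a in v)

def local_surface_normal(region, content_area, eye_center):
    """Unit normal for a local reflective region (3D points as tuples).

    The normal bisects the angle between the unit vector from the region
    toward its associated content area and the unit vector from the
    region toward the eye's center of rotation.
    """
    to_content = normalize(tuple(c - r for r, c in zip(region, content_area)))
    to_eye = normalize(tuple(e - r for r, e in zip(region, eye_center)))
    # Sum of two unit vectors points along the angle bisector.
    return normalize(tuple(a + b for a, b in zip(to_content, to_eye)))
```

For example, a region at the origin with its content area at (0, 1, 0) and the eye center at (1, 0, 0) yields the normal (1/sqrt(2), 1/sqrt(2), 0), which makes equal angles with both directions.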
[0022] In certain embodiments of the above aspects of the description, a separate reflective surface and/or a separate image display system is used for each of the user's eyes. In other embodiments, the reflective optical surface, either alone or in combination with other optical components (for example, one or more Fresnel lenses), collimates (or substantially collimates) the light from the image display system, such collimation being achieved through the surface's local radii of curvature.
[0023] In various embodiments, the HMD device can provide the user with a full dynamic foveal field of view, a full static foveal + peripheral field of view, or a full dynamic foveal + peripheral field of view.
[0024] In various embodiments, the HMD apparatus can be a binocular, non-pupil-forming system in which the eye is free to move about its center of rotation through all of its normal angular extents without being constrained to look through an external pupil. Previous HMD devices have claimed to have, or to be capable of, a wide field of view, but these devices included an external pupil through which the eye had to look. Although an ample amount of information may be provided to the eye, if the eye turns, the information fades away. This is a fundamental problem of pupil-forming systems that is avoided in the embodiments of the present description that employ reflective surfaces and, in particular, FS/UWA/RO surfaces.
[0025] The reference numbers used in the above summaries of the aspects of the invention (which reference numbers are representative, not all-inclusive or exhaustive) are for the convenience of the reader only and are not intended to, and should not be construed to, limit the scope of the invention. More generally, it is to be understood that both the foregoing general description and the following detailed description are merely exemplary of the invention and are intended to provide an overview or framework for understanding the nature and character of the invention.
[0026] Additional features and advantages of the invention are set forth in the following detailed description and, in part, will be readily apparent to those skilled in the art from that description or will be recognized by practicing the invention as exemplified by the description here. The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification. It is to be understood that the various features of the invention described in this specification and in the drawings can be used in any and all combinations.

Brief Description of the Drawings
[0027] [0027] Figure 1 is a schematic diagram showing the basic components of an HMD, that is, a viewfinder, a reflective surface, and a user's eye.
[0028] [0028] Figure 2 is a side view of a display device mounted on the head according to an exemplary embodiment.
[0029] [0029] Figure 3 is a front view representation of the display device mounted on the head of Figure 2.
[0030] [0030] Figure 4 is a ray diagram that illustrates the light paths in a head-mounted display apparatus from both a display and an external object, according to an exemplary embodiment.
[0031] [0031] Figure 5 is a ray diagram illustrating an exemplary embodiment that employs a curved display and a curved reflector.
[0032] [0032] Figure 6 is a top view of a head-mounted display device that illustrates the use of two curved reflective optical surfaces corresponding to a user's two eyes according to an exemplary embodiment.
[0033] [0033] Figure 7 is a schematic diagram that illustrates the static field of view of a user's eye for a forward gaze direction.
[0034] [0034] Figure 8 is a schematic diagram that illustrates the interaction between the static field of view of Figure 7 with an FS / UWA / RO surface according to an illustrative embodiment. The arrows in Figure 8 illustrate light propagation directions.
[0035] [0035] Figure 9 is a ray diagram that illustrates the light path from a given pixel of a display as it is reflected toward an eye, according to an illustrative embodiment.
[0036] [0036] Figure 10 is a ray diagram that illustrates the light paths from two pixels of a display as they are reflected toward an eye, according to an exemplary embodiment.
[0037] [0037] Figure 11 is a diagram that illustrates variables used in the selection of the direction of the local normal of a reflector according to an exemplary embodiment.
[0038] [0038] Figure 12 is a representation of a curved reflector along light paths according to an exemplary embodiment.
[0039] [0039] Figures 13 and 14 illustrate, from two perspectives, an FS / UWA / RO surface according to an exemplary embodiment.
[0040] [0040] Figures 15 and 16 illustrate, from two perspectives, another FS / UWA / RO surface according to an exemplary embodiment.
[0041] [0041] Figure 17 is a schematic diagram that illustrates the geometry for calculating a local normal on a reflective surface according to an exemplary embodiment. Detailed Description
[0042] [0042] Figures 2 and 3 are, respectively, a side view and a front view of a head-mounted display apparatus 100 shown being worn by a user 105. The head-mounted display apparatus employs an FS / UWA / RO surface 120.
[0043] [0043] In one embodiment, the head-mounted display apparatus 100 may, for example, be an optical see-through, augmented reality, binocular viewer. Because an optical see-through, augmented reality, binocular viewer is typically the most complex form of an HMD, the present description will mainly discuss embodiments of this type, with the understanding that the principles discussed here are equally applicable to optical see-through, augmented reality, monocular viewers, to augmented reality, monocular and binocular viewers generally, and to binocular and monocular "virtual reality" systems.
[0044] [0044] As shown in Figures 2 and 3, the head-mounted display apparatus 100 includes a frame 107 adapted to be worn by the user and supported by the user's nose and ears in a manner similar to eyeglasses. In the embodiment of Figures 2-3, as well as in the other embodiments described here, the head-mounted display apparatus may have a variety of configurations and may, for example, resemble swimming goggles, glasses, helmets, and the like. In some embodiments, a strap can be used to secure the frame of the HMD in a fixed position with respect to the user's eyes. In general terms, the outer surface of the HMD housing can take any shape that holds the optical system in the required orientation with respect to the HMD's display(s) and the user's eyes.
[0045] [0045] The head-mounted display apparatus 100 includes at least one image display system 110 and at least one optical system that includes a reflective optical surface which, as shown in Figures 2 and 3, is a free-space, ultra-wide-angle reflective optical surface 120, that is, an FS / UWA / RO surface 120, which, of necessity, is curved. In some embodiments, the FS / UWA / RO surface can constitute the entire optical system. The surface 120 can be purely reflective or can have both reflective and transmissive properties, in which case it can be considered a type of "beam splitter".
[0046] [0046] The FS / UWA / RO surface 120 can completely surround one or both eyes, as well as the at least one image display system 110. In particular, the surface can curve around the sides of the eyes and toward the sides of the face so as to expand the available horizontal field of view. In one embodiment, the FS / UWA / RO surface 120 can extend up to 180° or more (for example, more than 200°), as best seen in Figure 6, discussed below. As shown in Figure 3, the HMD can include two FS / UWA / RO surfaces 120R and 120L for the user's two eyes that are separately supported by the frame and/or a nasal recess piece 210 (see below). Alternatively, the HMD can employ a single FS / UWA / RO surface that serves both eyes as a single structure, some portions of which are seen by both eyes and other portions of which are seen by only one eye.
[0047] [0047] As noted immediately above and as shown in Figure 3, the head-mounted display apparatus 100 may include a nasal recess piece 210. The nasal recess piece may be a vertical wall or bar that provides a separation between the two FS / UWA / RO surfaces, one for each of the user's eyes. The nasal recess piece 210 can also provide a separation between the fields of view of the user's two eyes. In this way, the user's right eye can be shown a first representation of the three-dimensional physical reality in the environment, by displaying a first image to the right eye via a first image display device and a first FS / UWA / RO surface, while the user's left eye is shown a second representation of the three-dimensional physical reality in the environment, by displaying a second image to the left eye via a second image display device and a second FS / UWA / RO surface. A separate display device / reflective surface combination thus serves each of the user's eyes, with each eye seeing the correct image for its location relative to the three-dimensional physical reality in the environment. By separating the user's two eyes, the nasal recess piece 210 allows the image applied to each eye to be optimized independently of the other eye. In one embodiment, the vertical wall of the nasal recess piece may include two reflectors, one on each side, to allow the user to see imagery as he or she turns his or her eyes toward the nose, whether to the left or to the right.
[0048] [0048] The at least one image display system 110 can be mounted inside the FS / UWA / RO surface 120 and can be oriented horizontally or at a slight angle with respect to the horizontal. Alternatively, the at least one image display system can be located just outside the FS / UWA / RO surface. The tilt or angle of the at least one image display system 110, or, more particularly, of at least its light-emitting surface, will in general be a function of the location of the pixels, images, and/or display information that is to be reflected from the surface 120.
[0049] [0049] In certain embodiments, the head-mounted display apparatus 100 is configured to create an interior cavity, with the FS / UWA / RO surface being reflective into the cavity. For an FS / UWA / RO surface having transmissive properties, the image or display information from the at least one image display system is reflected into the cavity and toward the user's eye from the surface while, simultaneously, light from the outside world also enters the cavity and the user's eye by passing through the reflective surface.
[0050] [0050] As discussed in detail below, in certain embodiments, the at least one image display system 110 provides images and/or display information that, before entering the user's eye(s), are adjusted for near-to-the-eye viewing. In some embodiments, a lens or optical lens system 115 can contribute to this adjustment. Copending U.S. Patent Application No. 13/211,365, filed concurrently herewith and assigned to the same assignee as this application, in the names of G. Harrison, D. Smith, and G. Wiese, entitled "Head Mounted Display Device Using One or More Fresnel Lenses," and identified by attorney docket number IS-00307, the contents of which are hereby incorporated by reference, describes the use of one or more Fresnel lenses for this purpose. Other embodiments do not use a lens or optical lens system, relying instead on the FS / UWA / RO surface to provide the desired optical properties for near-to-the-eye viewing of the images formed by the display system.
[0051] [0051] The head-mounted display apparatus may include an electronics package 140 to control the images that are displayed by the at least one image display system 110. In one embodiment, the electronics package 140 includes accelerometers and gyroscopes that provide the location, orientation, and position information needed to synchronize images from the at least one image display system 110 with user activities. Power and video to and from the head-mounted display apparatus 100 can be provided through a transmission cable 150 coupled to the electronics package 140, or wirelessly.
[0052] [0052] A set of cameras 170 can be located on opposite sides of the head-mounted display apparatus 100 to provide input to the electronics package to help control the computer generation of, for example, "augmented reality" scenes. The camera set 170 can be coupled to the electronics package 140 to receive power and control signals and to provide video input to the electronics package's software.
[0053] [0053] The image display system used in the head-mounted display apparatus can take many forms, now known or subsequently developed. For example, the system may employ small, high-resolution liquid crystal displays (LCDs), light-emitting diode (LED) displays, and/or organic light-emitting diode (OLED) displays, including flexible OLED screens. In particular, the image display system can employ a high-definition, small-form-factor display device with high pixel density, examples of which can be found in the cell phone industry. A fiber optic bundle can also be used in the image display system. In various embodiments, the image display system can be considered to function as a small-screen television. If the image display system produces polarized light (for example, in the case where the image display system employs a liquid crystal display in which all colors are linearly polarized in the same direction), and if the FS / UWA / RO surface is polarized orthogonally to the light emitted by the display, then the light will not leak through the FS / UWA / RO surface. The displayed information and the light source itself will therefore not be visible outside the HMD.
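The leakage-blocking effect of crossed polarization can be illustrated with Malus's law, which gives the intensity of linearly polarized light transmitted through an analyzer at a given angle to the polarization axis. The sketch below is for illustration only; the function name and values are assumptions, not taken from the patent.

```python
import math

def transmitted_intensity(i0: float, angle_deg: float) -> float:
    """Malus's law: intensity of linearly polarized light of intensity i0
    after an analyzer oriented at angle_deg to the polarization axis."""
    return i0 * math.cos(math.radians(angle_deg)) ** 2

# Display light polarized at 0 deg; surface analyzes at 90 deg (crossed axes):
leak = transmitted_intensity(1.0, 90.0)    # effectively zero -> no light escapes
passed = transmitted_intensity(1.0, 0.0)   # parallel axes -> full transmission
```

At 90° the transmitted intensity is zero, which is why the displayed information is not visible from outside the HMD in this configuration.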
[0054] [0054] The overall operation of an exemplary embodiment of an optical system constructed in accordance with the present description, specifically, an optical system for an "augmented reality" HMD, is illustrated by the ray tracings of Figure 2, specifically, light rays 180, 185, and 190. In this embodiment, the FS / UWA / RO surface 120 has both reflective and transmissive properties. Using the transmissive properties of surface 120, light ray 190 from the environment passes through the surface and proceeds to the user's eye. From the same region of surface 120, light ray 180 is reflected by the surface (using the surface's reflective properties) and joins light ray 190 to create combined light ray 185 that enters the user's eye when the user looks in the direction of point 195, that is, when the user's gaze direction is toward point 195. While so looking, the user's peripheral vision capabilities allow the user to see light from other points in the environment passing through surface 120, again using the transmissive properties of the surface.
[0055] [0055] Figure 4 is a ray-tracing drawing that illustrates the operation of an exemplary embodiment of the head-mounted display apparatus 100. In this embodiment, the overall vision system includes three parts: (1) the at least one image display system 110, (2) the FS / UWA / RO surface 120, and (3) the user's eye 310. The eye 310 is represented with an internal lens 330. Light emitted from a pixel of the at least one image display system 110 is represented by ray 180, as in Figure 2. This light will appear at a point on the user's retina after being reflected by surface 120, provided the user's gaze direction and associated field of view (see the discussion of Figures 7 and 8 below) include the point where ray 180 strikes surface 120. More particularly, as discussed below, because of the optical properties involving the normal that bisects the vectors to the eye and to the pixel at that point of the FS / UWA / RO surface, the pixel will appear only at point 195; that is, even though light radiates from the pixel in a wider cone, the FS / UWA / RO surface is designed so that the light appears to the eye to come from only one location.
[0056] [0056] In Figure 4, it is assumed that the user's gaze direction is toward the point of intersection of ray 180 with surface 120, as illustrated by light rays 185 and 340. What the eye sees, however, is a virtual image that appears in the space in front of it, at a distance represented by vectors 345 and 350, for example, at infinity, as shown by reference numeral 352. In Figure 4, a chair is used for purposes of illustration, with the at least one image display system 110 producing a real image 355 of the chair that becomes virtual image 360 after the light emitted from the display system is reflected by the FS / UWA / RO surface 120. In an "augmented reality" environment, the optical system, including the FS / UWA / RO surface, could, for example, make the virtual image 360 of the chair appear to be in the same location as a person 365 actually present in the physical environment. It is noted that ray 345, which terminates at a distance closer than infinity, is included in Figure 4 to show that the image can be formed so as to appear optically at any distance between the user's vicinity and infinity. For example, the person could be standing 50 meters away, and that is where the chair would be placed.
[0057] [0057] In Figures 1-4, the at least one image display system is shown as having a planar light-emitting surface (for example, surface 111 in Figure 4). The display system may also have a curved light-emitting surface. Such an embodiment is shown in Figure 5, where light ray 405 emanates from a curved display screen 407 (curved light-emitting surface). This ray reflects from the FS / UWA / RO surface 120 and enters pupil 415 of the user's eye 310 (see ray 410). In this embodiment, surface 120 also admits light from the external environment, represented by ray 345, thus allowing images generated by the display to be superimposed on the external images. It is noted that, for purposes of illustration, ray 345 is shown offset from ray 410; for a direct overlay of the external image, ray 345 would be superimposed on ray 410.
[0058] [0058] As discussed above, previous optical systems used in HMDs that employed reflective optical surfaces were pupil-forming and therefore had limited viewing areas, a typical field of view being ~60 degrees or less. This greatly limited the value and capability of previous head-mounted displays. In various embodiments, the head-mounted displays described here have much wider fields of view (FOV), allowing much more optical information to be provided to the user compared with HMDs having smaller fields of view. The wide field of view can be greater than 100°, greater than 150°, or greater than 200°. In addition to providing more information, the wide field of view allows the additional information to be processed by the user in a more natural manner, enabling better immersive and augmented reality experiences through a stronger association of the displayed images with the user's surroundings.
[0059] [0059] Specifically, in the exemplary embodiment illustrated in Figure 6, for a forward gaze direction, the eye can take in a complete viewing area, represented in Figure 6 by FS / UWA / RO surfaces 201 and 202, corresponding to at least a 150-degree horizontal field of view (FOV) for each eye (for example, a ~168-degree horizontal FOV). This field of view is composed of the eye's foveal field of view and its peripheral field of view. In addition, the eye can move freely about its center of rotation to direct the combined foveal-plus-peripheral field of view toward different gaze directions, as the eye naturally does when viewing the physical world. The optical systems described here thus allow the eye to obtain information over a whole range of movement in the same way the eye does when viewing the natural world.
[0060] [0060] Examining Figure 6 in greater detail, this figure is a simplified line representation of the front of a user's head 200, as seen from above. It shows FS / UWA / RO surfaces 201 and 202 placed in front of the user's eyes 203 and 204. As discussed above, the FS / UWA / RO surfaces 201 and 202 can rest on the user's nose 205 where they come together at the front center 214 of the user's head 200. As discussed in greater detail below, the local normals and local spatial locations of surfaces 201 and 202 are adjusted so that the images produced by the at least one image display system (not shown in Figure 6) cover at least 100°, for example, in certain embodiments at least 150° and in other embodiments at least 200°, of horizontal FOV for each eye. (Optionally, as also discussed below, local radii of curvature are likewise adjusted to provide, when combined with a Fresnel lens system, distant virtual images.) For example, the local normals and local spatial locations can be adjusted to cover the complete ~168-degree static, horizontal field of view for each eye, with the 168 degrees extending from edge to edge of the FS / UWA / RO surface 201 or 202, as shown by sight lines 210, 211 and 212, 213. The sight lines therefore correspond to the wide static field of view (foveal plus peripheral) that is provided to the user. In addition, the user is free to move his or her eyes about centers of rotation 215 and 216 while continuing to view the computer-generated imagery.
[0061] [0061] In Figure 6, as well as in Figures 4, 5, and 12, the FS / UWA / RO surfaces are shown as portions of spheres for ease of presentation. In practice, the surfaces are not spheres but have more complex configurations, so that their local normals and local spatial locations (and, optionally, local radii of curvature) will provide the desired static and dynamic fields of view (and, optionally, desired distances for the virtual images). Also, in Figure 6, the right side of the head-mounted display apparatus operates identically to the left side, it being understood that the two sides may differ, if desired, for specific applications.
[0062] [0062] Figures 7 and 8 further illustrate the static and dynamic fields of view provided by the FS / UWA / RO surfaces described here. Figure 7 shows a nominal right eye 71 of a user, with a forward gaze direction 73. The eye's foveal-plus-peripheral field of view is shown by arc 75, which has an angular extent of ~168°. It is noted that, for purposes of presentation, in Figures 6-8 the field of view is shown relative to the center of rotation of the user's eye, as opposed to the center or edges of the user's pupil. In fact, the large field of view (for example, ~168°) obtained by a human eye is a result of the large angular extent of the retina, which allows highly oblique rays to enter the user's pupil and reach the retina.
[0063] [0063] Figure 8 schematically shows the interaction of the field of view of Figure 7 with an HMD having: (a) an image display system whose at least one light-emitting surface 81 has a first light-emitting region 82 (illustrated as a square) and a second light-emitting region 83 (illustrated as a triangle), and (b) an FS / UWA / RO surface having a first reflective region 84 with a first local normal 85 and a second reflective region 86 with a second local normal 87.
[0064] [0064] As indicated above, the FS / UWA / RO surface is both a "free space" surface and an "ultra-wide angle" surface. In addition, as noted above and discussed in greater detail below, the surface can participate in (or be the sole source of) the collimation (or partial collimation) of the light that enters the user's eye. Such collimation makes the virtual image produced by the FS / UWA / RO surface appear to be located at a long distance from the user, for example, at 30 meters or more, which allows the user to easily focus on the virtual image with a relaxed eye.
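The benefit of placing the virtual image 30 meters or more away can be quantified by the vergence of the light in diopters (the reciprocal of the distance in meters): a relaxed, unaccommodated eye focuses light of zero diopters. A short illustrative calculation (the function name and sample distances are assumptions, not from the patent):

```python
def vergence_diopters(distance_m: float) -> float:
    """Vergence of light from a point source at distance_m (meters).
    Collimated light corresponds to 0 diopters (source at infinity)."""
    return 1.0 / distance_m

# A virtual image placed 30 m away demands almost no accommodation:
v_far = vergence_diopters(30.0)   # ~0.033 D, close to a relaxed eye's 0 D
# By contrast, a 25 cm reading distance demands 4 D of accommodation:
v_near = vergence_diopters(0.25)  # 4.0 D
```

This is why a virtual image at 30 m or beyond appears essentially "at infinity" to the eye.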
[0065] [0065] The "free space" and ultra-wide angle "aspects of the FS / UWA / RO surface can be achieved by adjusting the normal surface locations so that the user's eye sees the light emitting regions of the hair at least one image visualization system as originating from predetermined regions of the FS / UWA / RO surface (predetermined locations on the surface).
[0066] [0066] For example, in Figure 8, the HMD designer could decide that it would be advantageous for a virtual image 88 of the square to be seen by the central portion of the user's retina when the user's gaze direction is straight ahead, and for a virtual image 89 of the triangle to be seen by the central portion of the user's retina when the gaze direction is, for example, ~50° to the left of straight ahead. The designer would then configure the at least one image display system, the FS / UWA / RO surface, and any other optical components of the system (for example, one or more Fresnel lenses between the image display system and the FS / UWA / RO surface) so that the virtual image of the square is straight ahead and the virtual image of the triangle is 50° to the left of straight ahead during use of the HMD.
[0067] [0067] In this way, when the user's gaze direction (field of view) intersects the FS / UWA / RO surface straight ahead, the virtual image of the square will be visible at the center of the user's eye, as desired, and when the user's gaze direction (field of view) intersects the FS / UWA / RO surface 50 degrees to the left of straight ahead, the virtual image of the triangle will be visible at the center of the user's eye, also as desired. Although not illustrated in Figures 7 and 8, the same approach is used for the vertical field of view, as well as for off-axis fields of view. More generally, in designing the HMD and each of its optical components, the designer "maps" the at least one light-emitting surface of the display onto the reflective surface, so that desired portions of the display are visible to the user's eye when the eye looks in particular directions. Thus, as the eye sweeps across the field of view, both horizontally and vertically, the FS / UWA / RO surface reflects different portions of the at least one light-emitting surface of the image display system into the user's eye. Although the preceding discussion was in terms of the center of a nominal user's retina, the design process can, of course, use the location of a nominal user's fovea instead, if desired.
[0068] [0068] It should be noted that, in Figure 8, a sufficient rotation of the user's eye to the right makes the virtual image 89 of the triangle no longer visible to the user. Thus, in Figure 8, any gaze direction that is straight ahead or to the left of straight ahead provides the user with virtual images of both the square and the triangle, while a gaze direction sufficiently to the right of straight ahead provides a virtual image of the square only. The sharpness of the virtual images will, of course, depend on whether the virtual images are perceived by the user's foveal vision or by the user's peripheral vision.
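The square-and-triangle visibility behavior can be modeled with a simple angular check: a virtual image is visible when it falls within the eye's ~168° static field of view centered on the current gaze direction. The half-angle, the azimuth sign convention, and the names below are illustrative assumptions, not taken from the patent.

```python
# Half of the ~168 deg static field of view (an assumed, illustrative value).
HALF_FOV_DEG = 84.0

def is_visible(image_azimuth_deg: float, gaze_azimuth_deg: float) -> bool:
    """True if a virtual image at image_azimuth_deg lies inside the field of
    view for a gaze at gaze_azimuth_deg (azimuths measured from straight
    ahead, positive to the right)."""
    return abs(image_azimuth_deg - gaze_azimuth_deg) < HALF_FOV_DEG

SQUARE_AZ = 0.0      # virtual image of the square: straight ahead
TRIANGLE_AZ = -50.0  # virtual image of the triangle: 50 deg to the left

# Looking straight ahead, both images fall inside the field of view:
both = is_visible(SQUARE_AZ, 0.0) and is_visible(TRIANGLE_AZ, 0.0)
# Rotating the eye far enough to the right drops the triangle out of view
# while the square remains visible:
triangle_gone = not is_visible(TRIANGLE_AZ, 40.0) and is_visible(SQUARE_AZ, 40.0)
```

With these assumed angles, a rightward gaze of 40° puts the triangle 90° off-axis, beyond the 84° half-angle, matching the behavior described above.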
[0069] [0069] If the HMD designer placed the virtual image of the square far to the right in Figure 8 while leaving the virtual image of the triangle far to the left, there would be gaze directions where only the virtual image of the square would be visible and other gaze directions where only the virtual image of the triangle would be visible. Likewise, based on the principles described here, the designer could arrange the virtual images of the square and the triangle so that the virtual image of the square was always visible, with the virtual image of the triangle visible for some gaze directions but not for others. As a further variation, the HMD designer could place the virtual images of the square and the triangle at locations where, for one or more gaze directions, neither image would be visible to the user; for example, the designer could place the virtual images just outside the user's static field of view for a forward gaze direction. The flexibility provided to the HMD designer by the present description is therefore readily apparent.
[0070] [0070] In one embodiment, the "free space" and "ultra-wide angle" aspects of the reflective surface are achieved using the principles of Fermat and Hero, according to which light travels along the shortest (least-time) optical path. The copending U.S. Patent Application, assigned to the same assignee as the present application, in the names of G. Harrison, D. Smith, and G. Wiese, entitled "Methods and Systems for Creating Free Space Reflective Optical Surfaces," and identified by attorney docket number IS-00354, the contents of which are hereby incorporated by reference, describes an embodiment in which the principles of Fermat and Hero are used to design FS / UWA / RO surfaces suitable for use in HMDs.
[0071] [0071] Through the Fermat and Hero least-time principles, any "desired portion" of the at least one light-emitting surface of an image display system (for example, any pixel of an image display system) can be caused to have any reflection point on the FS / UWA / RO surface, provided that the optical path from the desired portion of the at least one light-emitting surface to the reflection point on the FS / UWA / RO surface, and then to the center of rotation of the user's eye, is at an extremum.
[0072] [0072] An extremum in the optical path means that the first derivative of the optical path length has reached a zero value, signifying a maximum or a minimum in the optical path length. An extremum can be inserted at any point in the field of view by creating a local region of the reflective optical surface whose normal bisects (a) a vector from the local region to the user's eye (for example, a vector from the center of the local region to the center of rotation of the user's eye), and (b) a vector from the local region to the "desired portion" of the light-emitting surface (for example, a vector from the center of the local region to the center of the "desired portion" of the light-emitting surface). Figures 9 and 10 illustrate the process for the case where the "desired portion" of the at least one light-emitting surface of the image display system is a pixel.
[0073] [0073] Specifically, Figure 9 shows a light-emitting surface 510 of an image display system, composed of a generally rectangular array of pixels, emitting light toward the front of the head-mounted display apparatus, as represented by light ray 515. Light ray 515 is reflected from the reflective optical surface 520, which, for ease of presentation, is shown as planar in Figure 9. Upon reflection, light ray 515 becomes light ray 525, which enters the user's eye 530.
[0074] [0074] For the purpose of determining the surface normal of the reflector for each pixel, it is only necessary to determine the three-dimensional bisector of the vectors corresponding to light rays 515 and 525. In Figure 9, this bisecting vector is shown in two-dimensional form as line 535. The bisecting vector is normal to the reflective optical surface at reflection point 540, which is the location on surface 520 at which pixel 545 of light-emitting surface 510 will be visible to the HMD user.
[0075] [0075] Specifically, in operation, pixel 545 on display surface 510 emits light ray 515, which is reflected from reflective optical surface 520 at an angle established by the surface normal corresponding to bisecting vector 535 and its perpendicular plane 550, producing, by the principles of Fermat and Hero, a reflected pixel at reflection point 540 that is seen by eye 530 along light ray 525. In order to accurately calculate the surface normal at reflection point 540, ray 525 may be taken to pass approximately through the center 555 of the user's eye 530. The results will remain approximately stable even if the user's eye rotates, with the reflected pixel passing into peripheral vision until, as discussed above in connection with Figures 7 and 8, the eye rotates so far that the display region cannot be seen with either the user's foveal or peripheral vision.
[0076] [0076] To calculate the orientation of the surface normal, a quaternion-based method can be employed, where
q1 = orientation of ray 515
q2 = orientation of ray 525
q3 = orientation of the desired surface normal 535 = (q1 + q2) / 2
[0077] [0077] The surface normal can also be described in vector notation, as shown in Figure 11. In the following equation and in Figure 11, point N is one unit away from point M, the center of the region of interest of the reflective optical surface, and lies along the normal perpendicular to the tangent plane of the reflective optical surface at point M. The tangent plane of the reflective optical surface at point M is controlled to satisfy the relationship expressed in the following equation, such that, in three-dimensional space, the surface normal at point M bisects the line from point M to point P at the center of the pixel of interest and the line from point M to point C at the center of rotation of the user's eye (for reference, point C is approximately 13 mm behind the front of the eye).
[0078] [0078] The equation describing point N along the surface normal at point M is:
N = M + ((P − M) + (C − M)) / ‖(P − M) + (C − M)‖
[0079] [0079] where all of the points N, M, P, and C have components [x, y, z] that indicate their position in three-dimensional space in an arbitrary Cartesian coordinate system.
[0080] [0080] The resulting normal vector N − M has unit Euclidean length:
‖N − M‖ = 1
[0081] [0081] where the two vertical bars represent the Euclidean length, calculated as follows:
‖v‖ = sqrt(vx² + vy² + vz²), for v = [vx, vy, vz]
[0082] [0082] As a numerical example, consider the following values of M, P, and C:
M = [4, 8, 10]
P = [2, 10, 5]
C = [6, 10, 5]
[0083] [0083] The point N along the normal is calculated as follows:
P − M = [(2 − 4), (10 − 8), (5 − 10)] = [−2, 2, −5]
C − M = [(6 − 4), (10 − 8), (5 − 10)] = [2, 2, −5]
(P − M) + (C − M) = [0, 4, −10]
‖(P − M) + (C − M)‖ = sqrt(0² + 4² + (−10)²) = sqrt(116) = 10.7703296143
N = [0, 4, −10] / 10.7703296143 + [4, 8, 10]
= [0, 0.3713906764, −0.9284766909] + [4, 8, 10]
= [4, 8.3713906764, 9.0715233091]
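The bisecting-normal equation and the numerical example above can be reproduced directly in code. The following sketch (the function name is illustrative) computes the point N lying one unit from M along the local surface normal:

```python
import math

def surface_normal_point(m, p, c):
    """Point N one unit from M along the local surface normal, where the
    normal bisects the vector from M to the pixel of interest P and the
    vector from M to the eye's center of rotation C."""
    s = [(p[i] - m[i]) + (c[i] - m[i]) for i in range(3)]  # (P-M) + (C-M)
    length = math.sqrt(sum(v * v for v in s))              # Euclidean length
    return [m[i] + s[i] / length for i in range(3)]

# The numerical example from the text:
M = [4.0, 8.0, 10.0]
P = [2.0, 10.0, 5.0]
C = [6.0, 10.0, 5.0]
N = surface_normal_point(M, P, C)   # ~[4.0, 8.3714, 9.0715]
```

By construction, N − M is a unit vector, matching the stated property ‖N − M‖ = 1.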
[0084] [0084] The geometry is shown in Figure 17, where the bisector lies between the two longer vectors.
[0085] [0085] The foregoing is, of course, merely a representative calculation that serves to show the use of the Fermat and Hero least-time principles in determining the angular constraints on the local tangent plane for a field of points that form a free-space (free-form) surface of reflective regions designed to present a contiguous virtual image to the viewer. The only fixed constants are the center of rotation of the user's eye and the natural field of view of the eye. All other components can be iteratively updated until an appropriate solution for a given image display system and reflective optical surface orientation is achieved. Viewed another way, the pixel-image reflection locations M1, M2, ..., Mn, and their associated normals and curvatures, can be considered a matrix that is "warped" (adjusted) so that the FS / UWA / RO surface achieves the desired virtual-image processing of the computer-generated images formed by the image display system.
[0086] In applying the principles of Fermat and Hero, it should be noted that, in some embodiments, it will be desirable to avoid the situation in which the normals are adjusted in such a way that the user sees the same pixel reflection at more than one point. It should also be noted that, in some embodiments, the local regions of the reflective optical surface may be very small and may even correspond to a point on the reflector, with the points merging into neighboring points to form a smooth surface.
[0087] In order to ensure that the user can easily focus on the virtual image of the "desired portion" of the at least one light-emitting surface (for example, the virtual image of a pixel), the radius of curvature of the region surrounding the reflection point (reflection region) is controlled so that a collimated (or nearly collimated) image reaches the user. A collimated (or nearly collimated) image has optical rays that are substantially parallel, as if the image had originated far from the user, tens to hundreds of meters away, for example. To achieve such a surface, the radius of curvature of the reflection region of the reflective optical surface corresponding to the "desired portion" of the at least one light-emitting surface (the desired light-emitting pixel) can be maintained at approximately half the distance from the reflection region to the "desired portion" of the light-emitting surface (the actual pixel) on the display.
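The half-distance rule above can be illustrated (this is an added sketch, not part of the patent) with the concave-mirror equation 1/v + 1/u = 2/R: reflected light is collimated (zero diopter) exactly when the source sits at half the radius of curvature.

```python
def reflected_vergence(R, d):
    """Mirror equation 1/v + 1/u = 2/R: vergence 1/v (in diopters when
    distances are in meters) of light from a source at distance d after
    reflection off a concave mirror with radius of curvature R."""
    return 2.0 / R - 1.0 / d

R = 0.050                              # hypothetical 50 mm radius of curvature
print(reflected_vergence(R, R / 2))    # source at half the radius -> 0.0 (collimated)
print(reflected_vergence(R, 0.030))    # source elsewhere -> nonzero vergence
```

The zero-vergence case corresponds to the "light from an essentially infinitely distant point" described in paragraph [0089] below.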
[0088] Thus, in one embodiment, the surface normals for the reflection regions of the pixel of interest and its adjacent pixels satisfy a relationship that establishes a radius of curvature of approximately half the length of the vector from the pixel's reflection location on the reflective surface to the display pixel. Factors that affect this parameter include the size of the at least one light-emitting surface and whether the at least one light-emitting surface is curved.
[0089] Figure 10 illustrates this embodiment. In order to control the radius of curvature of the region surrounding a pixel reflection so that a collimated (or nearly collimated) image reaches the user, two adjacent pixel reflection regions are considered, such as at reflection point 540. More regions can be considered for a better balance, but two are sufficient. With reference to Figure 10, two pixel reflection points 540 and 610 are shown with respect to two pixels, 545 and 615, respectively, on the display surface 510. The surface normals at points 540 and 610 are calculated, together with the angle between their directions. The radius of curvature is calculated from these angles and the distance between points 540 and 610. Specifically, the surface configuration and, if necessary, the spatial location of the surface are adjusted until the radius of curvature is equal (or approximately equal) to half the average of the beam lengths 515 and 620. In this way, light of zero or nearly zero diopter can be provided to the user's eye. This is equivalent to light coming from an essentially infinitely distant point: the light wavefront is flat, and the normals to the wavefront are parallel.
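The radius-of-curvature check just described can be sketched as follows. This is an illustrative reduction under assumptions of my own (hypothetical point and normal values; variable names echo elements 540, 610, 515 and 620 of Figure 10): the local radius is approximated as the distance between the two reflection points divided by the angle between their surface normals, then compared with half the average beam length.

```python
import math

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def angle_between(a, b):
    """Angle (radians) between two vectors via the dot product."""
    a, b = unit(a), unit(b)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.acos(dot)

def radius_of_curvature(pa, pb, na, nb):
    """Approximate local radius: chord length d subtended by the change
    in normal direction theta gives R = d / theta."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(pa, pb)))
    theta = angle_between(na, nb)
    return d / theta

# Hypothetical adjacent reflection points and their surface normals.
p540, p610 = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]
n540, n610 = [0.0, 0.0, 1.0], [math.sin(0.05), 0.0, math.cos(0.05)]

R = radius_of_curvature(p540, p610, n540, n610)
beam_515, beam_620 = 40.0, 41.0           # hypothetical beam lengths (mm)
target = 0.5 * (beam_515 + beam_620) / 2  # half the average beam length
print(R, target)  # the design iterates until R is approximately equal to target
```

In the design loop of paragraph [0085], the surface would be re-deformed until R matches the target within tolerance.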
[0090] In addition to controlling the local radii of curvature, in certain embodiments, as a first-order solution to having a collimated (or nearly collimated) image enter the eye, the at least one light-emitting surface is nominally located a focal length away from the FS/UWA/RO surface, where the focal length is based on the average value of the radii of curvature of the various reflective regions that make up the FS/UWA/RO surface.
[0091] The result of applying the principles of Fermat and Hero is a set of reflective regions that can be combined into a smooth reflective surface. This surface will not, in general, be spherical or symmetrical. Figure 12 is a two-dimensional representation of such an FS/UWA/RO surface 520. As discussed above, surface 520 can be constructed in such a way that the radii of curvature at points 710 and 720 are set to values that provide a relaxed view of the image from the at least one light-emitting surface of the image viewing system that is being reflected by the surface. Thus, looking in one direction, represented by line 730, a collimated (or nearly collimated) virtual image will be provided to eye 530, as it will when looking in a different direction, represented by line 740. To allow a smooth transition of vision across the field of view, the regions of the FS/UWA/RO surface can be smoothly blended from one control point to another, as can be done by using Non-Uniform Rational B-Spline (NURBS) technology for splined surfaces, thus creating a smooth transition across the reflective surface. In some cases, the FS/UWA/RO surface may include a sufficient number of regions that the surface is smooth at a fine-grained level. In some embodiments, different magnifications for each portion of the display (for example, each pixel) can be provided using a smooth gradient, allowing for better productivity, performance and image quality.
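The smooth blending between control points that the paragraph above attributes to NURBS machinery can be illustrated with its simplest special case, a uniform cubic B-spline with unit weights (an illustrative reduction of my own, not the patent's actual surface representation):

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Uniform cubic B-spline blend of four control points for 0 <= t <= 1.
    The basis functions sum to 1 (partition of unity), and adjacent spans
    join with C2 continuity, so the curve glides smoothly past each control
    point rather than passing through it."""
    b0 = (1 - t) ** 3 / 6
    b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6
    b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6
    b3 = t ** 3 / 6
    return [b0 * a + b1 * b + b2 * c + b3 * d
            for a, b, c, d in zip(p0, p1, p2, p3)]

# Example: a gently varying run of 2-D control points.
controls = [[0.0, 0.0], [1.0, 0.5], [2.0, 0.4], [3.0, 0.9]]
samples = [cubic_bspline_point(*controls, t / 10) for t in range(11)]
print(samples[0], samples[-1])
```

A full NURBS surface generalizes this blend to two parameters with rational (weighted) basis functions, but the partition-of-unity property that produces the smooth transitions is the same.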
[0092] Figures 13 and 14 show an FS/UWA/RO surface created using the above techniques, from two different perspectives. Figures 15 and 16 show a refined version of the reflective surface of Figures 13 and 14, again from two perspectives. The FS/UWA/RO surfaces of these figures were designed using the computer-based techniques of the copending application, assigned to the same assignee as this application and entitled "Methods and Systems for Creating Free Space Reflective Optical Surfaces", mentioned above.
[0093] From the foregoing, it can be seen that methods for designing head-mounted displays have been described which, in exemplary embodiments, may include determining a desired field of view, choosing a viewing surface size (for example, width and height dimensions), choosing an orientation for the viewing surface with respect to a reflective surface, cataloging the position of each pixel on the viewing surface, and choosing a location at which to display each pixel of the viewing surface on the reflective surface. The viewing surface can be placed above the eye and tilted towards the reflective surface, allowing the curvature of the reflective surface to reflect light into the user's eye. In additional embodiments, the viewing surface can be placed in other positions, such as to the side of the eye or below the eye, with the reflective surface's position and curvature selected to reflect the light from the viewing surface appropriately, or with the viewing surface tilted at a different angle.
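The pixel-cataloging step mentioned above can be sketched as follows (a minimal illustration; the function name, dimensions and flat-panel assumption are mine, not the patent's):

```python
def pixel_centers(width_mm, height_mm, cols, rows):
    """Catalog the center position (x, y) of every pixel on a flat
    viewing surface of the given physical size, row-major order."""
    px, py = width_mm / cols, height_mm / rows
    return [((c + 0.5) * px, (r + 0.5) * py)
            for r in range(rows) for c in range(cols)]

# Example: a hypothetical 32 mm x 24 mm microdisplay, 4 x 3 pixels.
centers = pixel_centers(32.0, 24.0, 4, 3)
print(len(centers), centers[0], centers[-1])  # 12 (4.0, 4.0) (28.0, 20.0)
```

Each cataloged center would then be paired with a reflection location on the reflective surface, per the design steps above.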
[0094] In certain embodiments, a mathematical representation or three-dimensional instantiation of the reflective surface can be created, as discussed above, with each region of the reflective surface being a local region exhibiting a normal that bisects the vectors from the center of that region to the center of the user's eye and to the center of a pixel on the viewing surface. As also discussed above, the radii of curvature of the regions surrounding a pixel reflection can be controlled so that a collimated (or nearly collimated) image reaches the user across the field of view. Through computer-based iterations, adjustable parameters (for example, local normals, local curvatures, local spatial locations) can be varied until a combination (set) of parameters is identified that provides a desired level of optical performance over the field of view, as well as a workable design that is aesthetically acceptable.
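The computer-based iteration described above can be sketched under heavy simplifying assumptions of my own (a single fixed eye-center point C, one display pixel per surface region, and a plain relaxation step toward the bisector normal; none of these specifics are the patent's):

```python
import math

def bisector_normal(M, P, C):
    """Desired unit normal at M: bisects the directions M->P and M->C."""
    d = [(P[i] - M[i]) + (C[i] - M[i]) for i in range(3)]
    n = math.sqrt(sum(c * c for c in d))
    return [c / n for c in d]

def relax_surface(points, normals, pixels, C, steps=500, rate=0.3):
    """Relaxation loop: nudge each stored normal toward the bisector
    normal required by its pixel, renormalize, and stop once the whole
    set has converged."""
    for _ in range(steps):
        worst = 0.0
        for i, (M, P) in enumerate(zip(points, pixels)):
            want = bisector_normal(M, P, C)
            new = [h + rate * (w - h) for h, w in zip(normals[i], want)]
            n = math.sqrt(sum(c * c for c in new))
            normals[i] = [c / n for c in new]
            worst = max(worst, max(abs(w - c)
                                   for w, c in zip(want, normals[i])))
        if worst < 1e-6:
            break
    return normals

# Demo: one region, using the numbers from the worked example in [0082]-[0083].
normals = relax_surface([[4.0, 8.0, 10.0]], [[0.0, 0.0, 1.0]],
                        [[2.0, 10.0, 5.0]], [6.0, 10.0, 5.0])
print(normals[0])  # converges toward roughly [0, 0.3714, -0.9285]
```

A real design loop would additionally adjust local curvatures and spatial locations, as the paragraph notes; this sketch shows only the normal-adjustment step.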
[0095] During use, the non-symmetrical FS/UWA/RO surface, which, in certain embodiments, is constructed from a splined surface of multiple local focus regions, forms a virtual image of the at least one light-emitting surface of the image viewing system that is stretched across a wide field of view. The FS/UWA/RO surface can be thought of as a progressive mirror, a progressive curved beam divider, or a free-form mirror or reflector. As the eye scans the field of view, both horizontally and vertically, the curved FS/UWA/RO surface reflects light from different portions of the at least one light-emitting surface of the image viewing system into the user's eye. In various embodiments, the entire optical system can be manufactured in large quantities at low cost while maintaining an image quality compatible with typical human visual resolution.
[0096] In terms of the overall structure of the HMD, Table 1 presents representative, non-limiting examples of the parameters that an HMD display built according to the present description will typically meet. In addition, the HMD displays described here will typically have a distance between pixels that is small enough to ensure that a convincing image is established in the user's visual plane.
[0097] Several features that can be included in the head-mounted displays described here include, without limitation, the following, some of which have been referenced above: (1) In some embodiments, one or more Fresnel lenses can be used to modify the dioptric characteristics of the light beam emanating from the viewing surface. (2) In some embodiments, the reflective optical surface can be semitransparent, allowing light to come in from the external environment. The internal images generated by the display can then be superimposed on the external image. The two images can be aligned through the use of localization equipment, such as gyroscopes and cameras, and through software manipulation of the computer-generated images so that the virtual images are at appropriate locations in the external environment. In particular, a camera, an accelerometer, and/or gyroscopes can be used to help register the apparatus to physical reality and to superimpose its images on the external view. In these embodiments, the balance between the relative transmittance and reflectance of the reflective optical surface can be selected to provide the user with superimposed images having appropriate brightness characteristics. Also in these embodiments, the real-world image and the computer-generated image can both appear to be at approximately the same apparent distance, so that the eye can focus on both images at once. (3) In some embodiments, the reflective optical surface is kept as thin as possible in order to minimize effects on the position or focus of external light passing through the surface. (4) In some embodiments, the head-mounted display apparatus provides a field of view for each eye of at least 100 degrees, at least 150 degrees, or at least 200 degrees. (5) In some embodiments, the static field of view provided by the head-mounted display to each eye does not overlap the user's nose to any great degree. (6) In some embodiments, the reflective optical surface may employ a progressive transition of its optical prescription across the field of view to maintain focus over the available viewing area. (7) In some embodiments, ray tracing can be used to customize the parameters of the apparatus for a specific implementation, such as military training, flight simulation, games and other commercial applications. (8) In some embodiments, the reflective optical surface and/or the display surface, as well as the curvatures of the lenses (when used), and the distances between the display and the reflective optical surface and between the reflective optical surface and the eye, can be manipulated with respect to a Modulation Transfer Function (MTF) specification on the retina and/or the fovea. (9) In some embodiments, the HMDs described here can be implemented in applications such as, but not limited to, sniper detection, commercial training, military training and operations, and CAD manufacturing.
[0098] Once designed, the reflective optical surfaces described here (for example, FS/UWA/RO surfaces) can be produced, for example manufactured in quantity, using a variety of techniques and a variety of materials now known or subsequently developed. For example, the surfaces can be formed from plastic materials that have been metallized to be suitably reflective. Polished plastic or glass materials can also be used. For "augmented reality" applications, the reflective optical surfaces can be constructed from a transmissive material with small embedded reflectors, thus reflecting a portion of an incident wavefront while allowing light to be transmitted through the material.
[0099] For prototype parts, an acrylic plastic (for example, plexiglass) can be used, with the part formed by diamond turning. For production parts, acrylic or polycarbonate can be used, with the part formed, for example, by injection molding techniques. The reflective optical surface can be described by a detailed Computer Aided Design (CAD) description or by a NURBS (Non-Uniform Rational B-Spline) surface, which can be converted into a CAD description. A CAD file can allow the part to be formed using 3-D printing, where the CAD description results in a 3D object directly, without requiring machining.
[0100] The mathematical techniques discussed above can be encoded in various programming environments and/or programming languages, now known or subsequently developed. A currently preferred programming environment is the Java language running in the Eclipse programming interface. Other programming environments, such as Microsoft Visual C#, can also be used, if desired. Calculations can also be performed using the Mathcad platform from PTC of Needham, Massachusetts, and/or the Matlab platform from MathWorks, Inc., of Natick, Massachusetts. The resulting programs can be stored on a hard drive, a memory card, a CD or a similar device. The procedures can be carried out using typical desktop computing equipment available from a variety of vendors, for example, DELL, HP, TOSHIBA, etc. Alternatively, more powerful computing equipment, including "cloud" computing, can be used if desired.
[0101] A variety of modifications that do not depart from the scope and spirit of the invention will be evident to those skilled in the art from the foregoing description. For example, although reflective optical surfaces that provide the user with a large field of view, for example, a field of view greater than or equal to 100°, 150° or 200°, constitute an advantageous embodiment of the design aspects of the invention, the computer-based methods and systems for designing the reflective optical surfaces described here can also be used to create surfaces with smaller fields of view. The following claims are intended to cover these and other modifications, variations and equivalents of the specific embodiments set forth here.
权利要求:
Claims (8)
[0001]
Head-mounted display apparatus (100), characterized by the fact that it comprises: (I) a frame (107) adapted to be mounted on a user's head; (II) an image display system (110) supported by the frame (107); and (III) an ultra-wide-angle, free-space reflective optical surface (120) supported by the frame (107), the reflective optical surface (120) being a continuous surface that is not rotationally symmetric about any coordinate axis of a three-dimensional Cartesian coordinate system; wherein: (a) the image display system includes at least one light-emitting surface (81); (b) during use, the reflective optical surface produces spatially separated virtual images of spatially separated portions of the at least one light-emitting surface, so as to perform non-pupil-forming, in-focus, near-to-the-eye viewing of the spatially separated virtual images, at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees, said angular separation being measured from the center of rotation of a nominal user's eye; and (c) wherein the angular separation is about 200 degrees.
[0002]
Head-mounted visualization device according to claim 1, characterized by the fact that: at least one of the spatially separated virtual images is angularly separated from at least one other of the virtual images spatially separated by at least 150 degrees; and at least one point on the reflective optical surface (120) is angularly separated from at least another point on the reflective optical surface by at least 150 degrees.
[0003]
Head-mounted visualization device according to claim 1, characterized by the fact that, during use: at least one of the spatially separated virtual images is located along a direction of the eye that passes through at least one point on the reflective optical surface (120); and at least one other of the spatially separated virtual images is located along a direction of the eye that passes through at least one other point on the reflective optical surface (120).
[0004]
Head-mounted visualization device according to claim 1, characterized by the fact that the reflective optical surface (120) is semitransparent.
[0005]
Head-mounted display device according to claim 1, characterized by the fact that the device has one and only one reflective optical surface (120).
[0006]
Head-mounted display device according to claim 1, characterized by the fact that the device has two and only two reflective optical surfaces (120), one for each of the user's eyes.
[0007]
Head-mounted visualization apparatus according to claim 1, characterized by the fact that the image visualization system (110) is in a fixed relationship with the free-space reflective optical surface (120), and the apparatus further comprises an additional image visualization system and an additional reflective optical surface in fixed relationship with the additional image visualization system.
[0008]
Head-mounted display apparatus according to any one of the preceding claims, characterized in that the reflective optical surface (120) of ultra-wide angle and free space is configured to at least partially collimate the light emitted from at least a light-emitting surface (81) of the image display system.
同族专利:
公开号 | 公开日
EP2630540A2|2013-08-28|
MX2013004455A|2016-02-29|
US20120120499A1|2012-05-17|
JP6262530B2|2018-01-17|
AU2011319479B2|2015-10-29|
JP2014500518A|2014-01-09|
BR112013009826A2|2016-07-26|
AU2011319479A1|2013-06-13|
CA2815447A1|2012-04-26|
WO2012052979A2|2012-04-26|
CA2815447C|2019-02-12|
WO2012052979A3|2012-07-12|
US8625200B2|2014-01-07|
MX349190B|2017-07-18|
BR112013014975A2|2010-12-16|2020-08-11|Lockheed Martin Corporation|collimation display with pixel lenses|
CA2750287C|2011-08-29|2012-07-03|Microsoft Corporation|Gaze detection in a see-through, near-eye, mixed reality display|

Cited by:
US10908421B2|2006-11-02|2021-02-02|Razer Pte. Ltd.|Systems and methods for personal viewing devices|
US10359545B2|2010-10-21|2019-07-23|Lockheed Martin Corporation|Fresnel lens with reduced draft facet visibility|
US9632315B2|2010-10-21|2017-04-25|Lockheed Martin Corporation|Head-mounted display apparatus employing one or more fresnel lenses|
US8781794B2|2010-10-21|2014-07-15|Lockheed Martin Corporation|Methods and systems for creating free space reflective optical surfaces|
BR112013014975A2|2010-12-16|2020-08-11|Lockheed Martin Corporation|collimation display with pixel lenses|
US8885882B1|2011-07-14|2014-11-11|The Research Foundation For The State University Of New York|Real time eye tracking for human computer interaction|
US9285871B2|2011-09-30|2016-03-15|Microsoft Technology Licensing, Llc|Personal audio/visual system for providing an adaptable augmented reality environment|
US9345957B2|2011-09-30|2016-05-24|Microsoft Technology Licensing, Llc|Enhancing a sport using an augmented reality display|
US9286711B2|2011-09-30|2016-03-15|Microsoft Technology Licensing, Llc|Representing a location at a previous time period using an augmented reality display|
CN104216118A|2013-06-03|2014-12-17|约翰·T·默里|Head Mounted Display With Remote Control|
US9459457B2|2011-12-01|2016-10-04|Seebright Inc.|Head mounted display with remote control|
US9625720B2|2012-01-24|2017-04-18|Accipiter Radar Technologies Inc.|Personal electronic target vision system, device and method|
JP5891125B2|2012-06-29|2016-03-22|株式会社ソニー・コンピュータエンタテインメント|Video processing apparatus, video processing method, and video processing system|
US9310611B2|2012-09-18|2016-04-12|Qualcomm Incorporated|Methods and systems for making the use of head-mounted displays less obvious to non-users|
US9470893B2|2012-10-11|2016-10-18|Sony Computer Entertainment Europe Limited|Head mountable device|
WO2014141918A1|2013-03-11|2014-09-18|コニカミノルタ株式会社|Wearable computer|
US20140333773A1|2013-05-11|2014-11-13|Randy James Davis|Portable audio/ video mask|
US9672649B2|2013-11-04|2017-06-06|At&T Intellectual Property I, Lp|System and method for enabling mirror video chat using a wearable display device|
US9696552B1|2014-01-10|2017-07-04|Lockheed Martin Corporation|System and method for providing an augmented reality lightweight clip-on wearable device|
US9677840B2|2014-03-14|2017-06-13|Lineweight Llc|Augmented reality simulator|
JP6582536B2|2014-06-02|2019-10-02|株式会社ニデック|Target presentation device|
CN104102412B|2014-07-24|2017-12-12|央数文化股份有限公司|A kind of hand-held reading device and method thereof based on augmented reality|
US9508195B2|2014-09-03|2016-11-29|Microsoft Technology Licensing, Llc|Management of content in a 3D holographic environment|
US9599822B2|2014-09-08|2017-03-21|Oculus Vr, Llc|Corrective optics for reducing fixed pattern noise in a virtual reality headset|
US10252460B1|2014-09-15|2019-04-09|Lockheed Martin Corporation|Method of forming an optical element using an additive manufactured component, and related devices, components, and systems|
US9773350B1|2014-09-16|2017-09-26|SilVR Thread, Inc.|Systems and methods for greater than 360 degree capture for virtual reality|
US10684476B2|2014-10-17|2020-06-16|Lockheed Martin Corporation|Head-wearable ultra-wide field of view display device|
US9581819B1|2014-12-17|2017-02-28|Lockheed Martin Corporation|See-through augmented reality system|
US9563270B2|2014-12-26|2017-02-07|Microsoft Technology Licensing, Llc|Head-based targeting with pitch amplification|
KR20170104604A|2015-01-21|2017-09-15|테세랜드 엘엘씨|Display with total internal reflection|
US9939650B2|2015-03-02|2018-04-10|Lockheed Martin Corporation|Wearable display system|
WO2016154026A2|2015-03-20|2016-09-29|Castar, Inc.|Retroreflective light field display|
US10338384B2|2015-10-12|2019-07-02|North Inc.|Spatially separated exit pupils in a head mounted display|
US10754156B2|2015-10-20|2020-08-25|Lockheed Martin Corporation|Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system|
US10261319B2|2015-11-03|2019-04-16|Google Llc|Display of binocular overlapping images in a head mounted display|
CN105867606A|2015-12-15|2016-08-17|乐视致新电子科技(天津)有限公司|Image acquisition method and apparatus in virtual reality helmet, and virtual reality helmet|
US10713760B2|2015-12-31|2020-07-14|Thomson Licensing|Configuration for rendering virtual reality with an adaptive focal plane|
US10133345B2|2016-03-22|2018-11-20|Microsoft Technology Licensing, Llc|Virtual-reality navigation|
US9995936B1|2016-04-29|2018-06-12|Lockheed Martin Corporation|Augmented reality systems having a virtual image overlaying an infrared portion of a live scene|
US10159900B2|2017-03-17|2018-12-25|Roblox Corporation|Avatar teleport controller|
TWI622806B|2017-04-11|2018-05-01|宏碁股份有限公司|Virtual reality display apparatus|
WO2019055987A1|2017-09-18|2019-03-21|Mira Labs, Inc.|Bumper system|
TWI644126B|2017-11-28|2018-12-11|沅聖科技股份有限公司|Head mounted display and image device|
KR20190101814A|2018-02-23|2019-09-02|삼성전자주식회사|An electronic device including a semi-transparent member disposed at an angle specified with respect to a direction in which a video is output, below the video output module|
US11249310B1|2018-11-26|2022-02-15|Lockheed Martin Corporation|Augmented reality device with external light control layer for realtime contrast control|
US11067802B1|2019-01-15|2021-07-20|Lockheed Martin Corporation|Full human field of view wrap around head-mounted display apparatus with a convex display device|
Legal status:
2018-12-18| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-10-22| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-11-03| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-01-26| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 20/12/2011, SUBJECT TO THE LEGAL CONDITIONS. |
2021-10-13| B21F| Lapse acc. art. 78, item iv - on non-payment of the annual fees in time|Free format text: REFERRING TO THE 10TH ANNUITY. |
2022-02-15| B24D| Patent annual fee: restoration after fee payment|
Priority:
Application number | Filing date | Patent title
US40544010P| true| 2010-10-21|2010-10-21|
US61/405,440|2010-10-21|
US41732610P| true| 2010-11-26|2010-11-26|
US41732710P| true| 2010-11-26|2010-11-26|
US41732810P| true| 2010-11-26|2010-11-26|
US41732510P| true| 2010-11-26|2010-11-26|
US61/417,325|2010-11-26|
US61/417,328|2010-11-26|
US61/417,326|2010-11-26|
US61/417,327|2010-11-26|
US201061427530P| true| 2010-12-28|2010-12-28|
US61/427,530|2010-12-28|
US13/211,372|2011-08-17|
US13/211,372|US8625200B2|2010-10-21|2011-08-17|Head-mounted display apparatus employing one or more reflective optical surfaces|
PCT/IB2011/055820|WO2012052979A2|2010-10-21|2011-12-20|Head-mounted display apparatus employing one or more reflective optical surfaces|