Projection capture system, interactive projection capture system and projection capture method
Patent abstract:
PROJECTION CAPTURE SYSTEM, INTERACTIVE PROJECTION CAPTURE SYSTEM AND PROJECTION CAPTURE METHOD. In one example, a projection capture system includes a digital camera, a projector and a mirror housed together as a single unit, in which, when the unit is deployed for use with a work surface: the camera is positioned above the projector; the projector is positioned below the camera; and the mirror is positioned above the projector and configured to reflect light from the projector into the camera's capture area. In one example, the projection capture method includes: establishing a camera capture area within which the camera can capture an image of an object; establishing a projector display area, overlapping the capture area, into which a projector can project light; illuminating the camera capture area with the projector; and positioning a specular glare point, from the projector illuminating the capture area, outside the camera's capture area.

Publication number: BR112014002463B1
Application number: R112014002463-4
Filing date: 2011-08-02
Publication date: 2020-12-08
Inventor: David Bradley Short
Applicant: Hewlett-Packard Development Company, L.P.
IPC main class:
Patent description:
Technical field

[001] A new projection capture system was developed in an effort to improve the digital capture of images of documents and other objects, and to enhance the interactive user experience when working with real objects and projected objects on a physical work surface.

Description of the drawings

[002] Figures 1A and 1B are external perspective views illustrating a new projection capture system, according to an example of the invention. In figure 1A, the image of a two-dimensional object (a printed photograph) has been captured and displayed. In figure 1B, the image of a three-dimensional object (a cube) has been captured and displayed;

[003] Figure 2 is an internal perspective view illustrating a projection capture system, such as the system of figure 1, according to an example of the invention;

[004] Figure 3 is a block diagram of the projection capture system shown in figure 2;

[005] Figure 4 is a block diagram illustrating an example of a user input device in the system shown in figures 2 and 3;

[006] Figures 5 and 6 are side and front elevation views, respectively, illustrating the positioning of the camera and the projector in the projection capture system shown in figures 2 and 3;

[007] Figures 7-11 are a progression of side elevation views showing different positions for the projector and the camera in a projection capture system, illustrating some of the problems associated with moving the glare point out of the camera's capture area;

[008] Figures 12 and 13 illustrate an example of the camera in the projection capture system shown in figures 2 and 3;

[009] Figure 14 illustrates an example of the projector in the projection capture system shown in figures 2 and 3; and

[0010] Figures 15 and 16 illustrate examples of the user input device in the projection capture system shown in figures 2 and 3.

[0011] The same part numbers are used to designate the same or similar parts throughout the figures.

Detailed Description

[0012] The examples shown in the figures and described below illustrate, but do not limit, the invention, which is defined in the Claims following the present Description.

[0013] In one example of the new projection capture system, a digital camera, a projector and a mirror are housed together as a single unit, where, when the unit is deployed for use with a work surface, the camera is positioned above the projector, the projector is positioned below the camera, and the mirror is positioned above the projector and configured to reflect the projector's light into the camera's capture area. In one example, the projector provides a light source for the camera to capture images, and the camera, the projector and the mirror are positioned relative to one another so that the projector's glare point falls outside the camera's capture area.

[0014] Figures 1A and 1B are external perspective views of a new projection capture system 10 and an interactive work area 12 associated with system 10. Figure 2 is a perspective view illustrating an example of a projection capture system 10 with the outer housing 13 removed. Figure 3 is a block diagram of the system 10 shown in figure 2. With reference to figures 1A, 1B, 2 and 3, the projection capture system 10 includes a digital camera 14, a projector 16 and a controller 18.
Camera 14 and projector 16 are operationally connected to controller 18, so that camera 14 captures an image of an object 20 in the work area 12, projector 16 projects the object image 22 into the work area 12 and, in some examples, camera 14 captures an image of the projected object image 22. The bottom part of the housing 13 includes a transparent window 21 over the projector 16 (and the infrared camera 30).

[0015] In the example shown in figure 1A, a two-dimensional object 20 (a printed photograph) positioned on the work surface 24 of the work area 12 has been photographed by camera 14 (figure 2), the object 20 removed to the side of the work area 12, and the object image 22 projected onto the work surface 24, where it can be photographed by camera 14 (figure 2) and/or otherwise manipulated by the user. In the example shown in figure 1B, a three-dimensional object 20 (a cube) positioned on the work surface 24 has been photographed by camera 14 (figure 2), the object 20 removed to the side of the work area 12, and the object image 22 projected into the work area 12, where it can be photographed by camera 14 and/or otherwise manipulated by the user.

[0016] System 10 also includes a user input device 26 that allows the user to interact with system 10. The user can interact with the object 20 and/or the object image 22 in the work area 12 through the input device 26; the object image 22 can be transmitted to other work areas 12 of remote systems 10 (not shown) for collaborative user interaction; and, if desired, the object image 22 can be photographed by camera 14 and re-projected into local and/or remote work areas 12 for further user interaction. In figure 1A, the work surface 24 is part of the desktop or other underlying support structure 23. In figure 1B, the work surface 24 is on a portable mat 25 that can include touch-sensitive areas. In figure 1A, for example, a user control panel 27 is projected onto the work surface 24, while in figure 1B the control panel 27 may be incorporated into a touch-sensitive area of the mat 25. Similarly, a document positioning area 29 for an A4, letter or other standard-size document can be projected onto the work surface 24 of figure 1A or printed on the mat 25 of figure 1B. Of course, other configurations of the work surface 24 are possible. For example, it may be desirable in some applications of the system 10 to use an otherwise blank mat 25 to control the color, texture or other characteristics of the work surface 24; in that case, the control panel 27 and the document positioning area 29 can be projected onto the blank mat 25 of figure 1B in the same way as they are projected onto the desktop 23 of figure 1A.
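The capture-and-project workflow of paragraphs [0015] and [0016] can be summarized in a short sketch. The sketch below is illustrative only: the controller, camera and projector interfaces (ProjectionCaptureController, capture, display, flash, and so on) are hypothetical names, not APIs defined by this patent.

```python
# Minimal sketch of the capture-and-project workflow of paragraphs
# [0015]-[0016]. All class and method names are hypothetical.

class ProjectionCaptureController:
    """Coordinates one camera and one projector over a shared work area."""

    def __init__(self, camera, projector):
        self.camera = camera        # hypothetical camera driver
        self.projector = projector  # hypothetical projector driver

    def capture_and_project(self):
        # 1. Use the projector as the camera flash (all LEDs on at once).
        self.projector.flash()
        # 2. Photograph the physical object 20 in the capture area.
        image = self.camera.capture()
        # 3. Project the captured image 22 back onto the work surface,
        #    where it can be manipulated or photographed again.
        self.projector.display(image)
        return image

    def share(self, image, remote_units):
        # Transmit the object image to remote systems 10 for
        # collaborative interaction, as described in paragraph [0016].
        for unit in remote_units:
            unit.projector.display(image)
```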
[0017] In the example shown in figure 4, the user input device 26 includes a digital infrared pen 28 and an infrared camera 30 for detecting the pen 28 in the work area 12. Although any suitable user input device can be used, a digital pen has the advantage of allowing input in three dimensions, including along the work surface 24, without a touchpad or other special surface. In this way, system 10 can be used on a wide variety of work surfaces 24. In addition, the generally horizontal orientation of the work surface 24 makes it useful for many common tasks. The ability to use traditional writing instruments on the work surface 24 is an advantage over vertical or mobile computing interfaces. Projecting an interactive display onto the desktop mixes computing tasks with the standard objects that may exist on a real desktop, so physical objects can coexist with projected objects. As such, the comfort of using real writing instruments as well as their digital counterparts (like the pen 28) makes for an effective usage model. A three-dimensional, pad-free digital pen allows annotation on top of or next to physical objects, without a touch-sensitive pad getting in the way of using traditional instruments on the work surface 24.

[0018] In an exemplary implementation of system 10, projector 16 serves as the light source for camera 14. The camera capture area 32 (figure 12) and the projector display area 34 (figure 14) overlap on the work surface 24. Substantial operational efficiency can therefore be gained by using the projector 16 both for projecting images and for camera lighting. The light path of the projector 16 through the work area 12 to the work surface 24 must be positioned in relation to the camera 14 so as to allow user interaction with the display with minimal shadow occlusion, while avoiding specular glare off the work surface 24 and off objects in the work area 12 that would otherwise blind the camera 14. The system configuration described below avoids the glare-induced artifacts that would result from a conventional camera lighting geometry, while still maintaining a sufficiently steep angle of incidence of the projector light path for adequate illumination and projection of two- and three-dimensional objects in the work area 12.

[0019] Ideally, the projector 16 would be mounted directly above the work area 12, at infinite height above the work surface 24, to ensure parallel light rays. This configuration, of course, is not realistic. Even if the projector 16 were moved down to a realistic height above the work surface 24 (but still pointing straight down), the projector's light would be reflected off glossy and semi-glossy surfaces and objects straight back into the camera 14, creating a blinding specular glare. Thus, the glare point needs to be moved out of the camera's capture area 32. (Specular glare refers to glare from specular reflection, in which the angle of incidence of the incident light ray and the angle of reflection of the reflected light ray are equal, and the incident, reflected and normal directions are coplanar.)
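The parenthetical definition of specular glare in paragraph [0019] can be made concrete with standard reflection geometry; the following two-dimensional formulation is textbook optics added for illustration, not language from the patent. With the projector lens at horizontal position x_p and height h_p above a glossy work surface, and the camera at position x_c and height h_c, the law of reflection θ_i = θ_r places the specular glare point 39 where the line from the mirrored (virtual) projector at (x_p, -h_p) to the camera crosses the surface:

\[ x_g \;=\; x_p + \left(x_c - x_p\right)\frac{h_p}{h_p + h_c}. \]

Keeping x_g outside the capture area 32 is precisely the constraint explored in the figure 7-11 configurations discussed next.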
[0020] In order to obtain a commercially reasonable solution to this specular glare problem, camera 14 and projector 16 are moved away from the centers of the capture and display areas 32, 34, the projector 16 is positioned low, close to the base 36, as shown in figures 5 and 6, and a fold mirror 38 is inserted into the projector's light path to simulate a projector position high above the work surface 24. The simulated position of the projector 16, and the corresponding light path above the mirror 38, are shown in phantom lines in figures 5 and 6. Before describing the configuration of figures 5 and 6 in more detail, however, it is useful to consider the problems associated with other possible configurations for moving the glare point out of the camera's capture area 32.

[0021] In figure 7, camera 14 is positioned at the center of the capture area 32, with an overhead projector 16 slightly off-center so that camera 14 does not block the projector's light path. In the configuration of figure 7, the specular glare point 39 (at the intersection of the incident light ray 41 and the reflected light ray 43) falls within the capture area 32 and will therefore blind the camera 14 with respect to some objects and images in the capture area 32. In addition, for the configuration shown in figure 7, in which the camera 14 and the projector 16 are both positioned high above the base, the system 10 would be top-heavy and thus undesirable for a commercial product implementation. If the projector 16 is moved to the side by the distance required to bring the glare point 39 out of the camera capture area 32, as shown in figure 8, the corresponding projector lens offset becomes infeasible. In addition, any product implementation of the system 10 configuration shown in figure 8 would be undesirably broad and top-heavy.

[0022] Moving the camera 14 to an off-center position over the capture area 32, as shown in figure 9, allows the projector 16 to be brought in closer, but the projector lens offset is still too large and the product still top-heavy. In the configuration shown in figure 10, the projector 16 is raised to a height at which it can be brought in close enough for an acceptable lens offset but, of course, the product is now too tall and top-heavy. The most desirable solution is a "folded" light path for the projector 16, shown in figures 5 and 11, in which the "tall and narrow" configuration of figure 10 is simulated by means of the fold mirror 38. In figures 5 and 11, the projector 16 and the upper part of the light path are folded over the reflecting surface of the mirror 38, to project the same light path onto the work surface 24 as in the configuration of figure 10. This folding effect is best seen in figure 5, where the fold angles θ1 = θ2 and Φ1 = Φ2.
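The equal fold angles at the end of paragraph [0022] follow from elementary mirror geometry; the vector form below is standard optics, added for illustration. Reflecting the projector position p across the mirror plane through a point m with unit normal n̂ gives the virtual projector position

\[ p' \;=\; p - 2\,\big((p - m)\cdot\hat{n}\big)\,\hat{n}, \]

and, because reflection preserves angles (θ1 = θ2, Φ1 = Φ2 in figure 5), every ray leaving the real projector and striking the mirror 38 continues exactly as if it had been emitted from p'. The folded system therefore reproduces the light path of the tall configuration of figure 10 at a fraction of the height.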
[0023] As shown in figures 5 and 6, the camera 14 is positioned in front of the mirror 38, above the work area 12, so that it does not block the projector's light path. The camera 14 is positioned off-center in the Y direction (figure 5), as part of the overall geometry that keeps the glare point 39 out of the capture area 32 with an acceptable offset for both the camera 14 and the projector 16. The projector 16 is focused on the mirror 38, so that light from the projector 16 is reflected off the mirror 38 into the work area 12. By moving the projector 16 down low and introducing the fold mirror 38 into the projector's light path, the glare point 39 is kept out of the capture area 32 with an acceptable projector offset, and the system 10 is sufficiently narrow, short and stable (not top-heavy) to support a commercially attractive product implementation.

[0024] Thus, referring again to figures 1A, 1B and 2, the components of system 10 can be housed together as a single device 40. With reference also to figure 3, to help implement system 10 as an integrated standalone device 40, the controller 18 may include a processor 42, a memory 44 and an input/output device 46 housed together in the device 40. For this controller 18 configuration, the system programming that controls and coordinates the functions of camera 14 and projector 16 may reside substantially in the controller memory 44, for execution by the processor 42, thus enabling a standalone device 40 and reducing the need for special programming of camera 14 and projector 16. While other configurations are possible, for example one in which the controller 18 is formed wholly or partially by a computer or server remote from the camera 14 and the projector 16, a compact standalone appliance such as the device 40 shown in figures 1A, 1B and 2 offers the user full functionality in an integrated, compact mobile device 40.

[0025] Referring now to figure 12, the camera 14 is positioned in front of the mirror 38, above the work area 12, at a location off-center from the center of the capture area 32. As noted above, this off-center camera position helps avoid specular glare when photographing objects in the work area 12, without blocking the light path of the projector 16. While camera 14 generically represents any digital camera suitable for selectively capturing still and video images in the work area 12, it is expected that a high-resolution digital camera will be used in most applications of system 10. A "high resolution" digital camera, as used in this document, means a camera having a sensor array of at least 12 megapixels. Lower-resolution cameras may be acceptable for some basic scanning and copying functions, but resolutions below 12 megapixels are currently not adequate to generate a digital image sufficiently detailed for the full range of manipulation and collaboration functions. Small, high-quality digital cameras with high-resolution sensors are now quite common and commercially available from a variety of camera manufacturers. A high-resolution sensor paired with the high-performance digital signal processing (DSP) chips available in many digital cameras will support sufficiently fast image processing times, such as a click-to-preview time of less than one second, to deliver acceptable performance for most applications of system 10.

[0026] Referring now also to figure 13, in the example shown, the camera sensor 50 is oriented in a plane parallel to the plane of the work surface 24, and light is focused on the sensor 50 through a shift lens 52. This configuration of sensor 50 and lens 52 can be used to correct trapezoidal (keystone) distortion optically, without digital keystone correction of the object image. The field of view of camera 14 defines a three-dimensional capture space 51 in the work area 12 within which camera 14 can effectively capture images. The capture space 51 is bounded in the X and Y dimensions by the camera capture area 32 on the work surface 24. The lens 52 can be optimized for the fixed distance, fixed focus and fixed zoom corresponding to capture space 51.

[0027] With reference to figure 14, the projector 16 is positioned near the base 36, outside the projector display area 34, and focused on the mirror 38, so that light from the projector 16 is reflected off the mirror 38 into the work area 12. The projector 16 and the mirror 38 define a three-dimensional display space 53 in the work area 12 within which the projector 16 can effectively display images. The projector display space 53 overlaps the camera capture space 51 (figure 12) and is bounded in the X and Y dimensions by the display area 34 on the work surface 24. While the projector 16 generically represents any suitable light projector, the compact size and energy efficiency of an LED or laser DLP (digital light processing) projector will be desirable for most applications of system 10. The projector 16 may also employ a shift lens to allow complete optical keystone correction of the projected image.
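Where a shift lens is not available, trapezoidal distortion of the kind paragraphs [0026] and [0027] correct optically is commonly removed digitally with a planar homography. The sketch below, using OpenCV, is a contrast illustration only; the corner coordinates and file name are placeholders, and the patent itself relies on optical correction:

```python
# Digital keystone (trapezoidal) correction via a planar homography,
# shown for contrast with the optical shift-lens approach.
import cv2
import numpy as np

def correct_keystone(image, corners_px, out_w=1280, out_h=960):
    """Warp the trapezoid seen by an oblique camera back to a rectangle.

    corners_px: four corners of the work-surface rectangle as they appear
    in the captured frame, ordered top-left, top-right, bottom-right,
    bottom-left.
    """
    src = np.float32(corners_px)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    H = cv2.getPerspectiveTransform(src, dst)   # 3x3 homography matrix
    return cv2.warpPerspective(image, H, (out_w, out_h))

frame = cv2.imread("capture.png")               # placeholder file name
if frame is not None:
    # Example (placeholder) corner positions in the oblique view:
    rectified = correct_keystone(
        frame, [(210, 95), (1105, 120), (1180, 900), (140, 870)])
```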
As noted above, the use of the mirror 38 increases the length of the projector's effective light path, simulating an overhead placement of the projector 16, while still allowing a commercially reasonable height for an integrated standalone device 40.

[0028] An example of characteristics suitable for system 10 as a standalone device 40 is shown in Table 1. (The dimensional references in Table 1 are to figures 5 and 6.)

Table 1

[0029] Since the projector 16 acts as the light source for the camera 14 for both still-image and video capture, the projector light must be intense enough to swamp any ambient light that might cause defects from specular glare. It has been determined that a projector light of 200 lumens or more will be sufficiently intense to swamp ambient light for the typical desktop applications of system 10 and device 40. For video capture and real-time video collaboration, the projector 16 shines white light into the work area 12 to illuminate the object(s) 20. For an LED projector 16, the time sequencing of the red, green and blue LEDs that make up the white light is synchronized with the video frame rate of the camera 14. The refresh rate of the projector 16 and each LED sub-frame refresh period should be an integral multiple of the camera's exposure time for each captured frame, to avoid "rainbow banding" and other unwanted effects in the video image. In addition, the camera's video frame rate should be synchronized with the frequency of any ambient fluorescent lighting, which typically flickers at twice the AC line frequency (for example, 120 Hz for a 60 Hz AC power line). An ambient light sensor can be used to sense the ambient light frequency and adjust the camera 14 video frame rate accordingly. For still-image capture, the projector's red, green and blue LEDs can be turned on simultaneously, as a camera flash, to increase the light intensity in the work area 12, helping to swamp ambient light and allowing faster shutter speeds and/or smaller apertures to reduce image noise.

[0030] The exemplary configuration of system 10 integrated into the standalone device 40, shown in the figures and described above, achieves a desirable balance among product size, performance, usability and cost. The folded light path of the projector 16 reduces the height of the device 40 while maintaining an effective placement of the projector high above the work area 12, to prevent specular glare in the camera's capture area 32. The projector's light path strikes the horizontal work surface 24 at a steep angle, enabling the capture of 3D object images. This combination of a longer light path and a steep angle minimizes light fall-off across the capture area, maximizing the light uniformity for the camera flash. In addition, the folded light path makes it possible to position the projector 16 near the base 36, for product stability.
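As an arithmetic illustration of the timing constraints in paragraph [0029], the sketch below selects flicker-safe video frame rates and checks the LED sub-frame relation; the helper names and numeric values are examples, not values taken from the patent:

```python
# Timing constraints from paragraph [0029]:
#  (a) ambient fluorescent light flickers at twice the AC line frequency;
#  (b) the projector refresh and each LED sub-frame period should be an
#      integral multiple of the camera exposure time.
# Helper names and example values are illustrative.

def ambient_flicker_hz(ac_line_hz):
    # Fluorescent lighting flickers at twice the line frequency.
    return 2.0 * ac_line_hz

def flicker_safe_frame_rates(ac_line_hz, max_fps=60):
    # Frame rates that divide the flicker frequency evenly sample the
    # same ambient-light phase every frame, avoiding rolling bands.
    flicker = ambient_flicker_hz(ac_line_hz)
    return [fps for fps in range(1, max_fps + 1) if flicker % fps == 0]

def led_subframe_ok(subframe_s, exposure_s, tol=1e-9):
    # True when the LED sub-frame period is an integral multiple of the
    # camera exposure time.
    ratio = subframe_s / exposure_s
    return abs(ratio - round(ratio)) < tol

print(flicker_safe_frame_rates(60))       # 120 Hz flicker: ..., 30, 40, 60
print(led_subframe_ok(1 / 120, 1 / 360))  # True: 3 exposures per sub-frame
```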
[0031] Input devices and techniques suitable for use with system 10 include, for example, finger touch, touch gestures, a stylus, in-air gestures, speech recognition, head tracking and eye tracking. A touchpad can be used to enable a multi-touch interface for navigating a graphical user interface or for performing intuitive gesture actions such as push, flick, swipe, scroll, pinch-to-zoom and two-finger rotate. Depth cameras using structured light, time-of-flight, disturbed light pattern or stereoscopic vision can also be used to enable in-air gestures, or limited touch and touch-gesture detection, without a touchpad. A touch-free digital pen is particularly well suited as a user input 26 for system 10. Thus, in the example shown in the figures, the user input 26 includes an infrared digital pen 28 and an infrared camera 30 for detecting the pen 28 in the work area 12. As noted above, a touch-free digital pen has the advantage of allowing input in three dimensions, including along the work surface 24, without a touchpad or other special surface.

[0032] Referring now to figures 4 and 15, the input device 26 includes the infrared pen 28, the infrared camera 30 and a pen charging dock 54. The pen 28 includes an infrared light 56, a touch-sensitive tip switch 58 to turn the light 56 on and off automatically based on touch, and a manual on/off switch 60 to turn the light 56 on and off manually. (The tip switch 58 and the manual switch 60 are shown in the block diagram of figure 4.) The light 56 can be positioned, for example, at the tip of the pen 28, as shown in figure 15, to help maintain a clear line of sight between the camera 30 and the light 56. The light 56 can also emit visible light, to help the user determine whether the light is on or off.

[0033] The tip switch 58 can be touch sensitive down to approximately 2 g of force, for example, to simulate a traditional writing instrument. When the pen tip touches the work surface 24 or another object, the tip switch 58 detects the contact and turns on the light 56. The light 56 turning on is detected by the camera 30, which signals a touch contact event (similar to a mouse-button click or a finger touch on a touchpad). The camera 30 continues to signal contact, tracking any movement of the pen 28, for as long as the light 56 remains on. The user can slide the pen 28 along any surface, like a stylus, to trace the surface or to activate control functions. When the pen tip is no longer in contact with an object, the light 56 is switched off and the camera 30 stops signaling contact. The manual light switch 60 can be used to signal non-touch events. For example, when working in a three-dimensional work area 12, the user may wish to modify, alter or otherwise manipulate a projected image above the work surface 24 by manually signaling a "virtual" contact event.

[0034] The infrared camera 30 and the mirror 38 define a three-dimensional infrared capture space 61 in the work area 12 within which the infrared camera 30 can effectively detect light from the pen 28. The capture space 61 is bounded in the X and Y dimensions by the infrared camera capture area 62 on the work surface 24. In the example shown, best seen by comparing figures 14 and 15, the infrared camera capture space 61 is coextensive with the projector display space 53. In this way, the infrared camera 30 can capture pen activation anywhere in the display space 53.
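A minimal sketch of the contact-event logic of paragraph [0033] follows; the force threshold, class names and event names are illustrative assumptions, not details specified by the patent:

```python
# Sketch of the pen contact-event logic of paragraph [0033].
# The threshold and all names here are illustrative assumptions.

TIP_FORCE_THRESHOLD_G = 2.0   # tip switch trips at roughly 2 g of force

class InfraredPen:
    def __init__(self):
        self.light_on = False

    def update(self, tip_force_g, manual_switch_on=False):
        # Either the tip switch or the manual switch turns the light on.
        self.light_on = (tip_force_g >= TIP_FORCE_THRESHOLD_G) or manual_switch_on

class PenTracker:
    """Turns infrared-camera observations of the pen light into events."""

    def __init__(self):
        self.in_contact = False

    def observe(self, light_visible, position_xy):
        if light_visible and not self.in_contact:
            self.in_contact = True
            return ("pen_down", position_xy)   # like a mouse-button press
        if light_visible:
            return ("pen_move", position_xy)   # trace while the light stays on
        if self.in_contact:
            self.in_contact = False
            return ("pen_up", position_xy)     # contact ended
        return None
```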
[0035] In an exemplary implementation, shown in figure 16, the camera 30 is integrated into the projection light path, so that the projector's field of view and the infrared camera's field of view are coincident, to help ensure that the pen 28, and therefore the tracking signal from the infrared camera 30, stays properly aligned with the projector display anywhere in the work area 12. Referring to figure 16, the visible light 64 generated by the red, green and blue LEDs 66, 68 and 70 in the projector 16 passes through various optics 72 (including a shift lens 74) out to the mirror 38 (figure 14). Infrared light 75 from the pen 28 in the work area 12, reflected off the mirror 38 toward the projector 16, is directed to the infrared camera sensor 76 by an infrared ray deflector 78 through a shift lens 80. (As in the exemplary configuration of camera 14 described above, the infrared light sensor 76 of camera 30 can be oriented in a plane parallel to the plane of the work surface 24, with the light focused on the sensor 76 through the shift lens 80, for full keystone correction.)

[0036] It may be desirable, for some commercial implementations, to house the projector 16 and the infrared camera 30 together in a single housing 82, as shown in figure 16. The geometric configuration of the infrared camera 30 shown in figure 16 helps ensure that the pen tracking signal is aligned with the display, whatever the height of the pen 28 above the work surface 24. If the projector's field of view and the infrared camera's field of view are not coincident, it can be difficult to calibrate the pen tracking at more than one height above the work surface 24, creating the risk of parallax between the desired pen input position and the resulting displayed position.

[0037] Although it is expected that the work area 12 will usually include a physical work surface 24 for supporting an object 20, the work area 12 could also be implemented as a fully projected work area, without a physical work surface. In addition, the work area 12 can be implemented as a three-dimensional work area for working with two- and three-dimensional objects, or as a two-dimensional work area for working with only two-dimensional objects. While the configuration of the work area 12 will usually be determined largely by the hardware and programming elements of system 10, it can also be affected by the characteristics of a physical work surface 24. Thus, in some examples of system 10 and device 40, it may be appropriate to consider the work area 12 to be part of system 10, in the sense that the virtual work area accompanies system 10 and is manifested on a physical work surface when the device 40 is operational; in other examples, it may be appropriate to consider the work area 12 not to be part of system 10.

[0038] The examples of system 10 shown in the figures, with one camera 14 and one projector 16, do not preclude the use of two or more cameras 14 and/or two or more projectors 16. Indeed, it may be desirable in some applications of system 10 to include more than one camera, more than one projector, or more than one of the other system components. Thus, the articles "a" and "an", as used in this document, mean "one or more".

[0039] As noted at the beginning of the present Description, the examples shown in the figures and described above illustrate, but do not limit, the invention. Other examples, configurations and implementations are possible. Accordingly, the above Description should not be construed as limiting the scope of the invention, which is defined in the Claims below.
Patent claims:
Claims (19)

[0001] 1. Projection capture system, for use with a work surface, the system characterized by the fact that it comprises: a digital camera, a projector, a mirror, an infrared camera and a ray deflector housed together as a single unit, wherein, when the single unit is deployed for use with the work surface: the digital camera is positioned above the projector, and the digital camera defines a capture area within which the digital camera can acquire images on the work surface; the projector is positioned below the camera; the mirror is positioned above the projector and configured to reflect light from the projector, passing through the ray deflector, downward onto the work surface, and the projector and the mirror define a display area on the work surface overlapping at least part of the capture area; the infrared camera and the projector are housed together, along with the ray deflector, in a single housing, so that the field of view of the projector and the field of view of the infrared camera are coincident; the single housing is positioned below the digital camera and above the work surface; the mirror is outside the single housing; and infrared light emitted by an input device in said field of view is reflected off the mirror toward the projector, and the reflected infrared light is directed to the infrared camera by the ray deflector and detected by the infrared camera.

[0002] 2. System, according to claim 1, characterized by the fact that it further comprises a user input device operationally connected to the controller and configured to allow a user to interact with the system on the work surface, the input device including a pen for selectively emitting infrared light.

[0003] 3. System, according to claim 1 or 2, characterized by the fact that: the digital camera and the projector are operationally connected to each other, so that the projector provides a light source for the digital camera to capture images; and the digital camera, the projector and the mirror are positioned relative to one another so that, when the single unit is deployed for use with the work surface, the glare spot of the projector's light reflected off the mirror is outside the camera's capture area.

[0004] 4. System, according to claim 1 or 2, characterized by the fact that, when the unit is deployed for use with the work surface: the digital camera is positioned over the capture area, off-center with respect to the center of the capture area; and the projector is positioned outside the capture area and outside the display area.

[0005] 5. System, according to any one of claims 1 to 4, characterized by the fact that it further comprises a controller housed in the single unit, the controller operationally connecting the digital camera and the projector and configured to control the digital camera and the projector to: capture an image of an object positioned on the work surface in the capture area; and project the image of the object onto the work surface in the display area.

[0006] 6. System, according to claim 5, characterized by the fact that the single unit is configured to be used with a flat, horizontal work surface.

[0007] 7. System, according to claim 6, characterized by the fact that it further includes a portable mat usable with the single unit as the work surface.

[0008] 8. System, according to claim 1 or 2, characterized by the fact that the display area and the capture area are the same.

[0009] 9.
Projection capture system, for use with a work surface, the system characterized by the fact that it comprises: a digital camera, a projector, a controller, a mirror, an infrared camera and a ray deflector housed together as a single unit, wherein, when the single unit is deployed for use with the work surface: the digital camera is positioned above the projector and defines a capture area within which the digital camera can acquire images on the work surface; the mirror is positioned above the projector and configured to reflect light from the projector, passing through the ray deflector, downward onto the work surface; the projector and the mirror define a display area on the work surface overlapping at least part of the capture area; the infrared camera and the projector are housed together, along with the ray deflector, in a single housing, so that the field of view of the projector and the field of view of the infrared camera are coincident; the single housing is positioned below the digital camera and above the work surface; the mirror is outside the single housing; and infrared light emitted by an input device in that field of view is reflected off the mirror toward the projector, and the reflected infrared light is directed to the infrared camera by the ray deflector and detected by the infrared camera; wherein the controller operationally connects the digital camera and the projector and is configured to control the digital camera and the projector so that: the projector illuminates the capture area; the digital camera captures an image of an object positioned on the work surface in the capture area and illuminated by the projector; and the projector projects the image of the object onto the work surface in the display area.

[0010] 10. System, according to claim 9, characterized by the fact that it further comprises the user input device operationally connected to the controller and configured to allow a user to interact with the system on the work surface, the input device including a pen for selectively emitting infrared light.

[0011] 11. System, according to claim 9 or 10, characterized by the fact that the digital camera, the projector and the mirror are positioned relative to one another so that, when the single unit is deployed for use with the work surface, the glare point of the projector's light reflected off the mirror is outside the camera's capture area.

[0012] 12.
Interactive projection capture system, for use with a three-dimensional work area having a horizontal work surface, said system characterized by the fact that it comprises: a digital camera, a projector, a mirror, an infrared camera and a ray deflector housed together as a single unit, wherein, when the single unit is deployed for use with the work area: the digital camera is positioned above the projector and defines a three-dimensional capture space within which the camera can effectively capture images, the capture space bounded in two dimensions by the capture area on the work surface; the projector is positioned below the camera; the mirror is positioned above the projector and configured to reflect light from the projector, passing through the ray deflector, downward onto the work surface; the projector and the mirror define a three-dimensional display space bounded in two dimensions by a display area on the work surface overlapping at least part of the capture area; the infrared camera and the projector are housed together, along with the ray deflector, in a single housing, so that the display space and the capture space of the infrared camera are coincident; the single housing is positioned below the digital camera and above the work surface; the mirror is outside the single housing; and infrared light emitted above said work surface is reflected off the mirror toward the projector, and the reflected infrared light is directed to the infrared camera by the ray deflector and detected by the infrared camera; the digital camera, the projector and the mirror are positioned relative to one another so that light from the projector, reflected off the mirror, illuminates the capture space and the glare point of the reflected light is outside the capture space; and a user input device operationally connected to the digital camera and the projector and configured to allow the user to interact with the system in the work area, the user input device emitting infrared light.

[0013] 13. System, according to claim 12, characterized by the fact that the user input device includes a digital infrared pen for selectively emitting infrared light within the work area.

[0014] 14. System, according to claim 12 or 13, characterized by the fact that it further comprises a controller operationally connecting the digital camera and the projector and configured to control the digital camera and the projector so that: the projector illuminates the capture space; the digital camera captures an image of an object positioned on the work surface in the capture area and illuminated by the projector; the projector projects the image of the object onto the work surface in the display area; and the digital camera captures an image of the projected object image.

[0015] 15.
Projection capture method employing an image capture camera, a projector, a mirror, an infrared camera and a ray deflector housed together as a single unit for use with a work surface, characterized by the fact that it comprises: positioning the image capture camera above the projector, to establish a camera capture area within which the camera can capture an image of an object on the work surface; positioning the projector below the image capture camera; positioning the mirror, configured to reflect light from the projector, passed through the ray deflector, downward onto the work surface, above the projector, to establish a projector display area overlapping the capture area and into which the projector can project light; housing the infrared camera and the projector together, along with the ray deflector, in a single housing, so that the field of view of the projector and the field of view of the infrared camera are coincident; reflecting infrared light emitted by an input device in said field of view off the mirror toward the projector, directing the reflected infrared light to the infrared camera by means of the ray deflector, and detecting it with the infrared camera; illuminating the camera's capture area with the projector; and positioning a specular glare point, from the projector illuminating the capture area, outside the camera's capture area, wherein the single housing is positioned below the digital camera and above the work surface, and the mirror remains outside the single housing.

[0016] 16. Method, according to claim 15, characterized by the fact that the input device includes a pen for selectively emitting infrared light.

[0017] 17. Method, according to claim 15 or 16, characterized by the fact that positioning the specular glare point outside the camera's capture area includes the projector projecting light toward the mirror and the mirror reflecting the light into the camera's capture area.

[0018] 18. Method, according to claim 15 or 16, characterized by the fact that positioning the specular glare point outside the camera's capture area includes folding the light path from the projector to the camera's capture area along a fold line.

[0019] 19. Method, according to claim 18, characterized by the fact that the fold line is defined by a light-reflecting surface.
Similar technologies:
Publication number | Publication date | Patent title
BR112014002463B1 | 2020-12-08 | Projection capture systems, interactive projection capture system and projection capture method
US9521276B2 | 2016-12-13 | Portable projection capture device
US9560281B2 | 2017-01-31 | Projecting an image of a real object
JP6078884B2 | 2017-02-15 | Camera-type multi-touch interaction system and method
CN107426503B | 2020-04-28 | Intelligent lighting device
JP5950130B2 | 2016-07-13 | Camera-type multi-touch interaction device, system and method
JP6175866B2 | 2017-08-09 | Interactive projector
EP0055366B1 | 1987-03-18 | System for remotely displaying and sensing information using shadow parallax
JP2006003414A | 2006-01-05 | Exhibiting device
TWI439785B | 2014-06-01 | Multi resolution display system
JP5797995B2 | 2015-10-21 | Sheet with reflection frame
BR112014002448B1 | 2021-12-07 | Projection capture device
Patent family:
Publication number | Publication date
CN103828341A | 2014-05-28
EP2740259A1 | 2014-06-11
US20140139668A1 | 2014-05-22
EP2740008A1 | 2014-06-11
CN103827744B | 2016-04-06
WO2013019252A1 | 2013-02-07
JP6068392B2 | 2017-01-25
KR101825779B1 | 2018-02-05
JP2014529925A | 2014-11-13
CN103827744A | 2014-05-28
JP6059223B2 | 2017-01-11
JP2014239441A | 2014-12-18
EP2740259A4 | 2015-04-22
WO2013019217A1 | 2013-02-07
KR20140068902A | 2014-06-09
CN103828341B | 2017-02-15
KR20140054146A | 2014-05-08
IN2014CN00543A | 2015-04-03
BR112014002463A2 | 2017-02-21
BR112014002448A2 | 2017-02-21
EP2740008A4 | 2015-09-16
KR101787180B1 | 2017-10-18
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US2529664A | 1948-02-14 | 1950-11-14 | Ward Hickok | Visual educational projector
EP0622722B1 | 1993-04-30 | 2002-07-17 | Xerox Corporation | Interactive copying system
JPH08336065A | 1995-06-09 | 1996-12-17 | Nikon Corporation | Original illuminator and image input device
US5639151A | 1996-02-16 | 1997-06-17 | Mcnelley; Steve H. | Pass-through reflective projection display
GB9614837D0 | 1996-07-12 | 1996-09-04 | Rank Xerox Ltd | Interactive desktop system with multiple image capture and display modes
US6095661A | 1998-03-19 | 2000-08-01 | Ppt Vision, Inc. | Method and apparatus for an L.E.D. flashlight
EP1203489B1 | 1999-08-10 | 2008-07-09 | Teleportec Inc | Communications system
JP4023048B2 | 1999-10-05 | 2007-12-19 | Hitachi, Ltd. | Rear projection image display system
US6618076B1 | 1999-12-23 | 2003-09-09 | Justsystem Corporation | Method and apparatus for calibrating projector-camera system
US6965460B1 | 2000-08-08 | 2005-11-15 | Hewlett-Packard Development Company, L.P. | Method and system for scanning an image using a look-down linear array scanner
WO2002043390A2 | 2000-11-06 | 2002-05-30 | Jianbo Shi | Paper-based remote sketching system
JP4608766B2 | 2000-11-27 | 2011-01-12 | Sony Corporation | Method for driving solid-state imaging device and camera
US6431711B1 | 2000-12-06 | 2002-08-13 | International Business Machines Corporation | Multiple-surface display projector with interactive input capability
US7710391B2 | 2002-05-28 | 2010-05-04 | Matthew Bell | Processing an image utilizing a spatially varying pattern
US6554434B2 | 2001-07-06 | 2003-04-29 | Sony Corporation | Interactive projection system
JP2003152851A | 2001-11-14 | 2003-05-23 | NEC Corp | Portable terminal
JP2003276399A | 2002-03-25 | 2003-09-30 | Matsushita Electric Ind Co Ltd | Position detecting method and device and electronic blackboard device
US6840627B2 | 2003-01-21 | 2005-01-11 | Hewlett-Packard Development Company, L.P. | Interactive display device
JP2004265185A | 2003-03-03 | 2004-09-24 | Canon Inc | Image projecting device with camera/with digitizer
JP2004320123A | 2003-04-11 | 2004-11-11 | Nec Viewtechnology Ltd | Data presentation apparatus
US20050068442A1 | 2003-09-26 | 2005-03-31 | Corey Billington | Presentation system and method of use
JP2005128413A | 2003-10-27 | 2005-05-19 | Pentax Corp | Illumination controller
US7496229B2 | 2004-02-17 | 2009-02-24 | Microsoft Corp. | System and method for visual echo cancellation in a projector-camera-whiteboard system
JP2005250392A | 2004-03-08 | 2005-09-15 | Olympus Corp | Camera
JP2005354306A | 2004-06-09 | 2005-12-22 | Brother Ind Ltd | Image input output apparatus
JP2006004010A | 2004-06-15 | 2006-01-05 | Brother Ind Ltd | Image input/output device
US7519223B2 | 2004-06-28 | 2009-04-14 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications
US7557966B2 | 2004-08-11 | 2009-07-07 | Acushnet Company | Apparatus and method for scanning an object
JP4433960B2 | 2004-09-21 | 2010-03-17 | Nikon Corporation | Portable information equipment
JP3901185B2 | 2004-09-27 | 2007-04-04 | Casio Computer Co., Ltd. | Projection apparatus, projection image capturing method, and program
EP1686554A3 | 2005-01-31 | 2008-06-18 | Canon Kabushiki Kaisha | Virtual space generating system, image processing apparatus and information processing method
EP1898260A4 | 2005-06-30 | 2014-07-09 | Ricoh Co Ltd | Projection image display device
US7614753B2 | 2005-10-31 | 2009-11-10 | Hewlett-Packard Development Company, L.P. | Determining an adjustment
US7599561B2 | 2006-02-28 | 2009-10-06 | Microsoft Corporation | Compact interactive tabletop with projection-vision
US8649848B2 | 2006-03-28 | 2014-02-11 | The United States Of America, As Represented By The Secretary Of The Air Force | Synchronization of illumination source and sensor for improved visualization of subcutaneous structures
JP4697087B2 | 2006-08-09 | 2011-06-08 | Fuji Xerox Co., Ltd. | Image processing device
US7690795B2 | 2006-10-06 | 2010-04-06 | Hewlett-Packard Development Company, L.P. | Projector/camera system
US7907781B2 | 2007-06-21 | 2011-03-15 | Mitsubishi Electric Research Laboratories, Inc. | System and method for determining geometries of scenes
US9377874B2 | 2007-11-02 | 2016-06-28 | Northrop Grumman Systems Corporation | Gesture recognition light and video image projector
JP5504570B2 | 2008-03-27 | 2014-05-28 | Casio Computer Co., Ltd. | Projector with built-in camera and photographing method of projector with built-in camera
JP5277703B2 | 2008-04-21 | 2013-08-28 | Ricoh Co., Ltd. | Electronics
JP5309724B2 | 2008-06-24 | 2013-10-09 | Funai Electric Co., Ltd. | Projector
US8355038B2 | 2009-01-28 | 2013-01-15 | Hewlett-Packard Development Company, L.P. | Systems for capturing images through a display
US8619178B2 | 2009-01-28 | 2013-12-31 | Hewlett-Packard Development Company, L.P. | Image rendition and capture
JP2010224015A | 2009-03-19 | 2010-10-07 | Sanyo Electric Co Ltd | Projection video display device, writing board, and projection video system
JP2010238213A | 2009-03-30 | 2010-10-21 | Plus Vision Corp | Tablet PC system and electronic writing sheet
JP5395507B2 | 2009-05-21 | 2014-01-22 | Canon Inc. | Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and computer program
NO331527B1 | 2009-12-23 | 2012-01-23 | Cisco Systems Int Sarl | Method of removing flickering in video recording
US8842096B2 | 2010-01-08 | 2014-09-23 | Crayola Llc | Interactive projection system
US8490002B2 | 2010-02-11 | 2013-07-16 | Apple Inc. | Projected display shared workspaces
US8751049B2 | 2010-05-24 | 2014-06-10 | Massachusetts Institute Of Technology | Kinetic input/output
US8928735B2 | 2011-06-14 | 2015-01-06 | Microsoft Corporation | Combined lighting, projection, and image capture without video feedback
US9582083B2 | 2011-12-22 | 2017-02-28 | Apple Inc. | Directional light sensors
JP6175866B2 | 2013-04-02 | 2017-08-09 | Fujitsu Limited | Interactive projector
US20160077670A1 | 2013-07-31 | 2016-03-17 | Hewlett-Packard Development Company, L.P. | System with projector unit and computer
EP3036602A4 | 2013-08-22 | 2017-04-12 | Hewlett-Packard Development Company, L.P. | Projective computing system
US9569892B2 | 2013-09-26 | 2017-02-14 | Qualcomm Incorporated | Image capture input and projection output
CN104656869B | 2013-11-20 | 2018-07-06 | Lenovo Ltd. | A kind of data processing method and electronic equipment
JP5841587B2 | 2013-12-25 | 2016-01-13 | PFU Limited | Imaging system
CN106255938B | 2014-02-28 | 2019-12-17 | Hewlett-Packard Development Company, L.P. | Calibration of sensors and projectors
JP2015215416A | 2014-05-08 | 2015-12-03 | Fujitsu Limited | Projector device
EP3147766A4 | 2014-05-23 | 2018-03-21 | Cinepic Technology Limited | Interactive display system
WO2016018406A1 | 2014-07-31 | 2016-02-04 | Hewlett-Packard Development Company, L.P. | Image projection and capture with adjustment for white point
US20170219915A1 | 2014-07-31 | 2017-08-03 | Hewlett-Packard Development Company, L.P. | White flash generation from a light emitting diode projector
WO2016018416A1 | 2014-07-31 | 2016-02-04 | Hewlett-Packard Development Company, L.P. | Determining the location of a user input device
US20170223342A1 | 2014-07-31 | 2017-08-03 | Hewlett-Packard Development Company, L.P. | Object capture and illumination
CN107113417B | 2014-08-01 | 2020-05-05 | Hewlett-Packard Development Company, L.P. | Projecting an image onto an object
CN106796446B | 2014-08-04 | 2020-05-12 | Hewlett-Packard Development Company, L.P. | Workspace metadata management
CN107005645A | 2014-09-10 | 2017-08-01 | Hewlett-Packard Development Company, L.P. | The image projection shown while with LED light and capture
EP3191918B1 | 2014-09-12 | 2020-03-18 | Hewlett-Packard Development Company, L.P. | Developing contextual information from an image
WO2016118173A1 | 2015-01-23 | 2016-07-28 | Hewlett-Packard Development Company, L.P. | Tracking a handheld device on surfaces with optical patterns
EP3286718A4 | 2015-04-23 | 2018-12-05 | Hasbro, Inc. | Context-aware digital play
US9659371B2 | 2015-10-08 | 2017-05-23 | Christie Digital Systems Usa, Inc. | System and method for online projector-camera calibration from one or more images
CN106210702A | 2016-08-30 | 2016-12-07 | Sichuan Wisesoft Co., Ltd. | A kind of synchronization three dimensional data collection instrument, acquisition system and acquisition method
EP3550330A4 | 2016-11-29 | 2019-10-30 | Panasonic Intellectual Property Management Co., Ltd. | Distance measuring device
WO2018136029A1 | 2017-01-17 | 2018-07-26 | Hewlett-Packard Development Company, L.P. | Input device with touch sensitive surface
CN106791747A | 2017-01-25 | 2017-05-31 | 触景无限科技(北京)有限公司 | The time-sharing handling method of desk lamp interaction display, device and desk lamp
JP6399135B1 | 2017-03-30 | 2018-10-03 | NEC Corporation | Image input/output device and image input/output method
CN112997114A | 2018-09-26 | 2021-06-18 | Hewlett-Packard Development Company, L.P. | Image capturing assembly
Legal status:
2018-12-26 | B06F | Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]
2019-10-15 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2020-03-24 | B25G | Requested change of headquarters approved | Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. (US)
2020-10-20 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2020-12-08 | B16A | Patent or certificate of addition of invention granted [chapter 16.1 patent gazette] | Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 02/08/2011, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Publication number | Filing date | Publication date | Patent title
PCT/US2011/046253 | WO2013019217A1 | 2011-08-02 | 2011-08-02 | Projection capture system and method