Patent abstract:
A device for displaying an image sequence in the form of a pixel matrix comprises a control unit coupled to a projector. The control unit includes an input interface configured to receive asynchronous information representing, for each pixel of the matrix, events relating to the pixel, and a processor configured to control the activation of each pixel of the matrix at instants determined by respective events indicated by the asynchronous information for said pixel. The projector is arranged on a support so as to illuminate photoreceptors of the eye during use of the device, and is configured to project a luminous flux corresponding to the pixels activated by the control unit.
Publication number: FR3019317A1
Application number: FR1452558
Filing date: 2014-03-26
Publication date: 2015-10-02
Inventors: Ryad Benosman; Guillaume Chenegros; Serge Picaud; Siohoi Ieng
Applicant: Centre National de la Recherche Scientifique CNRS; Universite Pierre et Marie Curie Paris 6
IPC main classification:
Patent description:

[0001] The present invention relates to devices for displaying visual information from asynchronous information.
[0002] Devices in the form of glasses, allowing a user to view a scene captured by a camera on a miniature screen fixed to the frame or arranged on a glass, have been marketed for some years. The camera is generally attached to the spectacle frame so as to capture a scene in the user's field of view, for example for augmented reality applications. There are also vision assistance devices for users equipped with a visual implant responding to electrical stimulation. Some of these devices take the form of glasses on which are mounted a camera, which produces a video stream sent to a video processing unit that encodes this stream to generate an electrical stimulation signal, and an external coil for inductive transmission of an electromagnetic signal to an internal coil connected to a retinal implant.

These existing devices, however, cannot be used by people equipped with a visual photodiode implant, or by people who have benefited from optogenetic treatment. Indeed, such implants or optogenetic treatments require light stimulation signals of much higher intensity than that of ambient light. A vision aid implant placed in the eye, typically comprising an electrode around which are arranged one to three photodiodes, will only work effectively if these photodiodes receive light seven times more powerful than ambient light, so that the photodiodes can emit a stimulus. Likewise, current optogenetic treatments are only fully effective if the treated eye receives light signals having a specific wavelength and a light intensity ranging from two to seven times that of ambient light. The required light powers are therefore so high that the use of display methods implemented on conventional devices, at these power levels, would cause damage to the visual organs of users.

There is thus a need for devices for displaying a scene that do not have the disadvantages of the conventional methods set forth above. In particular, a first need is to provide display devices for a scene that can be used for applications in the field of vision assistance. Another need is to provide scene visualization devices that are compatible with use by persons equipped with a photodiode visual implant and/or persons having received optogenetic treatment.

According to a first aspect, there is provided a device for displaying an image sequence in the form of a matrix of pixels, comprising a control unit operatively coupled to a projector, the control unit comprising an input interface configured to receive asynchronous information representing, for each pixel of the matrix, events relating to the pixel, and a processor configured to control the activation of each pixel of the matrix at times determined by respective events indicated by the asynchronous information for said pixel. The projector of the proposed display device is arranged on a support so as to illuminate photoreceptors of the eye during use of the device, and is configured to project a luminous flux corresponding to the pixels activated by the control unit. The events concerning the pixel may, depending on the embodiment, correspond to variations of light for the pixel, to the detection of a shape of interest or to the detection of a primitive, and more generally to any type of asynchronous information for the pixel.

The use of asynchronous information representing events to generate commands for activating the pixels of a matrix has many advantages.
These result notably from the fact that such signals are not sampled in time according to a predefined clock frequency, like the frame clock of a conventional video signal. They provide what is called an address-event representation (AER) of a scene to be visualized: to each pixel corresponds an event-based signal sequence. Processes for acquiring or synthesizing a sequence of frame images have the disadvantage of producing highly redundant data, because each frame represents a large number of pixels of an image, if not an entire image, and all of the pixels whose information does not change from one image to the next generate redundancy in the data representing the sequence of images. This redundancy can only be partially suppressed by compression encoding of a conventional video signal. Conversely, asynchronous signals make it possible to obtain a very compact representation of data relating to a sequence of images, because these data, representing events for a pixel (and not for all the pixels of a matrix or a large number of them), are not redundant from one image to another.

The activation of a pixel from asynchronous information can be performed respecting, to within a certain time resolution, the asynchronous nature of the sequence of events, so as to achieve an event-driven activation. Thus, the proposed device allows the projection, onto photoreceptors of the user's eye (photoreceptors naturally present in the eye and/or photoreceptors of a visual implant), of a luminous flux corresponding to the asynchronously activated pixels. The asynchronous activation of the pixels makes it possible to activate simultaneously only a small number of pixels (for example a single pixel or a group of co-located pixels) and consequently to stimulate by a luminous flux only a local portion of the photoreceptor zone. The light intensity of such a flux, aimed only at the stimulation of a localized area, can then be raised to the levels required for the intended application. In particular, the small amount of data representing a sequence of images in an AER-type asynchronous signal makes it possible to increase the intensity of the light signals exciting the photoreceptors of a visual prosthesis or of a visual organ to which an optogenetic treatment has been applied.

In one embodiment, the control unit may further be configured to, after activating a pixel of the matrix at an instant determined by an event indicated by the asynchronous information, repeat the activation command of said pixel at substantially the same activation level at times defined by a refresh sequence. Activating pixels from asynchronous information thus makes it possible to consider activating a pixel only when an event corresponding to this pixel is detected in the input data representing, for example, a sequence of images, and to perform refresh activations at a much lower frequency than conventional display methods. In one embodiment, receipt of the asynchronous information may include receiving a signal carrying the asynchronous information, and the activation control of a pixel may comprise detecting, in the signal, information representing an event. In addition, the refresh sequence may define pixel activation times separated by a time interval.
This time interval between an event activation and a refresh activation, or between two refresh activations, may for example be determined according to the retinal persistence of the human eye. The retinal persistence of the human eye constitutes a limit that it is preferable not to exceed before refreshing the pixel, at the risk of degrading the visual comfort of the user. For example, this time interval will be chosen between 40 ms and 800 ms, and preferably between 40 ms and 150 ms, to avoid flicker effects, knowing that a longer time interval corresponds to a lower refresh rate and a reduced flow of display commands and associated computations.

According to one embodiment of the device, the projector comprises a matrix of micro-mirrors, a control unit of the matrix of micro-mirrors, a control input for receiving the pixel activation commands, and an optical input for receiving a luminous flux. According to one embodiment of the device, the projector support is in the form of a pair of glasses, the projector being placed on a surface of the glasses.

In another aspect, there is provided a scene display system, including a display subsystem operatively coupled to an acquisition subsystem, wherein the display subsystem includes a device according to the first aspect, and wherein the acquisition subsystem comprises a scene-facing sensor operatively coupled to a processing unit configured to generate asynchronous information representing, for each pixel, events. According to one embodiment of the system, the sensor is a light sensor comprising a scene acquisition optics and a matrix of photosensitive elements. According to one embodiment of the system, the sensor is mounted on the projector support so that the captured scene substantially corresponds to the visual field of a user of the device. According to one embodiment of the system, the projector support is in the form of a pair of glasses, the projector is mounted on a first surface of the glasses, the sensor is mounted on the upper part of the spectacle frame, and the control unit of the device and the processing unit of the acquisition subsystem are mounted on a second surface of the glasses. Alternatively, there may be provided a system in which the projector support is in the form of a pair of glasses, the projector is mounted on a first surface of the glasses, and the sensor, the control unit of the device and the processing unit of the acquisition subsystem are mounted on a second surface of the glasses.

Other features and advantages of the present invention will emerge from the following description of nonlimiting exemplary embodiments, with reference to the appended drawings, in which: FIG. 1 is a block diagram of a device for displaying a sequence of images according to an embodiment of the proposed device; FIG. 2 is a block diagram of a projection device according to an embodiment of the proposed device; FIG. 3 is a block diagram of an optical subsystem of a projection device according to an embodiment of the proposed device; FIGS. 4a, 4b and 4c are block diagrams of an optical subsystem of a projection device according to particular embodiments of the proposed device;
FIGS. 5a, 5b, 5c and 5d illustrate pairs of spectacles on which are mounted an image sequence display device and a light sensor according to embodiments of the proposed device and system; FIGS. 6a and 6b are diagrams illustrating pixel activation control methods implemented by the device and system provided in one embodiment; FIGS. 7a and 7b are diagrams showing a temporal sequence of events received in an asynchronous signal for the implementation of the proposed device and system; FIGS. 7c and 7d are diagrams showing a temporal sequence of pixel activation commands generated according to particular embodiments of the proposed device and system; FIG. 8 is a block diagram of a scene display system according to an embodiment of the proposed system; FIG. 9a is a block diagram of a light acquisition device adapted to generate an asynchronous signal according to an embodiment of the proposed system; FIG. 9b is a diagram showing an exemplary light intensity profile at a pixel of an asynchronous sensor; FIG. 9c shows an example of a signal delivered by the asynchronous sensor in response to the intensity profile of FIG. 9b; FIG. 9d illustrates the reconstruction of the intensity profile from the signal of FIG. 9c.

In the following detailed description of embodiments of the invention, many specific details are presented to provide a more complete understanding. Nevertheless, those skilled in the art will appreciate that embodiments can be practiced without these specific details. In other cases, well-known features are not described in detail to avoid unnecessarily complicating the description.
[0003] The invention will be described hereinafter in the nonlimiting context of asynchronous information representing, for a pixel of a matrix of pixels, events corresponding to light variations for the pixel. The proposed devices and systems are however not limited to this particular embodiment, the events concerning the pixel being able, depending on the embodiment, to correspond to variations of light for the pixel, to the detection of a shape of interest or to the detection of a primitive, and more generally to any type of asynchronous information for the pixel.

Figure 1 shows an image sequence display device 100 comprising a control unit 102 for controlling the activation of pixels, and a projector 104 for projecting a luminous flux onto a user's eye. In the following, the images to be viewed are considered in the form of a matrix of elementary objects called pixels. In one embodiment of the proposed device, the control unit 102 includes an input interface 101 for receiving asynchronous information. The asynchronous information received on the input interface 101 represents, for each pixel of the matrix, events corresponding to light variations for the pixel. It thus corresponds to a sequence of images, each considered in the form of a matrix of pixels. The input interface 101 may be configured to receive the asynchronous information in different forms, or formats, corresponding to different embodiments of the device. It can also be provided in accordance with various standard formats, such as, for example, the Universal Serial Bus (USB) format. The proposed device is not limited to a particular asynchronous information format, to a particular carrier of this information (for example, an asynchronous signal carrying information representative of an event flow), or to a specific input interface format.
[0004] Asynchronous information can indeed be generated by various means. For example, it may be carried by an asynchronous signal produced by an asynchronous vision sensor and received on the input interface 101. It can also result from the encoding of a sequence of synthetic images producing a set of data received on the interface 101.
[0005] In general, the asynchronous information represents events relating to the pixels of the matrix. In a particular embodiment, the asynchronous information indicates, or signals, events relating to one or more pixels. For example, it can include data that identifies events together with their respective characteristics.
[0006] According to one embodiment, the asynchronous information for a pixel may be carried by a succession of pulses or spikes, positive or negative, positioned in time at instants tk depending on the light profile for the pixel. These spikes can be represented mathematically by positive or negative Dirac peaks, each characterized by a transmission instant tk and a sign bit. The information corresponding to an event for a pixel may comprise a first piece of information relating to an instant of occurrence of the event, and a second piece of information relating to a luminous characteristic for the pixel at this instant.
[0007] The form of the asynchronous information for a pixel may differ from a succession of Dirac peaks, the represented events possibly having a temporal width, an amplitude, or any waveform. In this embodiment, the input interface 101 thus receives, for each pixel of position p = (x; y), data representing a series of binary pulses, which can be modeled by ON or OFF events generated asynchronously at respective instants tk. The information corresponding to an event includes position information for the pixel for which the event has occurred (for example, the pair (x; y) of row and column numbers in the pixel matrix), information on the occurrence time of the event (e.g., a discrete time value relative to a reference), and event type information (e.g., a bit for encoding events of two types). In a particular embodiment, the input interface 101 may be configured to receive an asynchronous signal carrying information representative of a stream of events, each defined by a quadruplet e(x; y; t; ON/OFF) giving the position p = (x; y) of the pixel with which the event is associated, the time t at which the event was detected, and the type (ON or OFF) of the event. In another particular embodiment, the asynchronous signal received at the input of the interface 101 carries information representative of a flow of events where each event is defined by a quadruplet e(x; y; t; g) giving the position p = (x; y) of the pixel with which the event is associated, the time t at which the event was detected, and a gray level g associated with the event.

The control unit 102 further comprises a data processing unit 102a having a processor 103a operatively coupled to a memory 103b and to the input interface 101. Depending on the embodiment, the memory 103b may contain software instructions which, when executed by the processor 103a of the data processing unit 102a, cause this unit 102a to control the input interface 101, to perform pixel activation control according to the various methods described herein, and to transmit commands to the projector 104. The processing unit 102a may be a component implementing a processor or a calculation unit for generating pixel activation commands according to the various methods described and for controlling the input interface 101 and the projector 104 of the device 100. In addition, the input interface 101, the processing unit 102a, the processor 103a and/or the memory means 103b can be, separately or jointly, implemented in software form, as described above, in hardware form, as an application-specific integrated circuit (ASIC), or in the form of a combination of hardware and software elements, such as one or more software programs intended to be loaded and executed respectively on one or more FPGA (Field Programmable Gate Array) components. They can be implemented, separately or jointly, in the form of an electronic circuit, or within one or more electronic components (chip or chipset). Likewise, different modes of implementation of the device 100 can be envisaged. For example, the input interface 101, the data processing unit 102a and/or the memory means 103b can be implemented on an asynchronous information processing electronic module operatively coupled to an electronic projection module which implements the components of the projector 104. The control unit 102 generates pixel activation commands whose format is adapted to the input interface of the projector 104.
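As an illustration of the event formats just described, the following is a minimal sketch of the two quadruplet variants e(x; y; t; ON/OFF) and e(x; y; t; g) as a data structure; the Event class and its field names are hypothetical conveniences for illustration, not part of the patent.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Polarity(Enum):
    OFF = 0   # light intensity decreased beyond the threshold
    ON = 1    # light intensity increased beyond the threshold

@dataclass
class Event:
    """One AER event for a pixel of the matrix (hypothetical encoding)."""
    x: int                               # column index of the pixel
    y: int                               # row index of the pixel
    t: float                             # detection time, relative to a reference
    polarity: Optional[Polarity] = None  # variant e(x; y; t; ON/OFF)
    gray: Optional[int] = None           # variant e(x; y; t; g), e.g. 0..1023

# The two formats described above:
e_onoff = Event(x=12, y=34, t=0.001000, polarity=Polarity.ON)
e_gray = Event(x=12, y=34, t=0.001250, gray=512)
```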
[0008] In a particular embodiment of the device, the output interface of the control unit 102 is configured to generate an output signal whose format is adapted to control a video projector using micro-mirror matrix technology (DMD, for "Digital Micromirror Device"). This type of projector is sometimes referred to by the acronym DLP, for "Digital Light Processing", and works with a light source that illuminates a matrix of micro-mirrors which vibrate according to the amount of light to reflect. The vibrations of each mirror are performed around two positions respectively corresponding to inclinations about an axis: one in which the light is reflected by the mirror toward an output optic, and the other in which the light is reflected by the mirror toward an absorbing surface and is not projected. Each mirror of the DMD matrix projects light for one pixel of the pixel matrix forming the projected image. Some DLP projectors are further capable of illuminating pixels with different gray levels, and can therefore receive activation commands for each pixel including the gray level with which the pixel is to be illuminated. For example, it would be possible to use a DLP projector capable of handling 1024 gray levels, and to provide it with activation commands according to the methods described, in which the illumination gray level of a pixel is coded on 10 bits.

FIG. 2 shows an exemplary implementation of the projector 104 according to a particular embodiment of the proposed device using a DLP projector. The projector 104 includes an input interface 111, a DMD matrix control unit 112, and an optical subsystem 113 including the DMD matrix. The input interface 111 is operatively coupled to the control unit 102 of FIG. 1, and is adapted to receive the pixel activation commands generated by the control unit 102. This coupling may conform to different standard formats, such as, for example, Universal Serial Bus (USB). The DMD matrix control unit controls the positioning of each mirror of the DMD matrix according to the pixel activation commands received by the interface 111.
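As an illustration of such a command, the sketch below packs a pixel activation command carrying a 10-bit gray level (1024 levels, as in the example above) into a single word. The bit layout and function name are assumptions made for illustration: the actual command format depends on the projector's input interface.

```python
def encode_activation_command(x: int, y: int, gray: int) -> int:
    """Pack a pixel activation command into one integer word.

    Hypothetical layout: bits 0-9 carry the 10-bit gray level
    (1024 levels), bits 10-19 the row index y, bits 20-29 the
    column index x. A real DLP input interface defines its own format.
    """
    if not 0 <= gray < 1024:
        raise ValueError("gray level must be coded on 10 bits")
    return (x << 20) | (y << 10) | gray

cmd = encode_activation_command(x=12, y=34, gray=700)  # one command word
```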
[0009] The spatial resolution for the control of the micro-mirrors can be chosen according to the intended application and the size of the micro-mirror matrix of the DMD used. For example, one can maintain a spatial resolution of the order of one pixel and illuminate a pixel without activating its neighbors, and thus activate only one micro-mirror. It is also possible, in another embodiment, to group pixels in packets, for example of 3 x 3 pixels, to generate different gray levels as a function of the number of pixels activated in the packet, and to activate the corresponding micro-mirrors. Furthermore, the input interface 111 may be configured to receive pixel activation commands in different forms, or formats, corresponding to different embodiments of the device. The proposed device is not limited to a particular pixel activation command format, or to a specific input interface format. The format of the pixel activation commands will preferably be compatible with the input interface 111 and the control unit 112.
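A minimal sketch of the packet-based grouping just described: a gray level is approximated by turning on a proportional number of binary micro-mirrors in a 3 x 3 packet. The function name, the 8-bit input scale, and the fill order within the packet are assumptions for illustration.

```python
import numpy as np

def packet_mirror_mask(gray: int, packet: int = 3) -> np.ndarray:
    """Approximate a gray level with binary micro-mirrors by activating
    a number of mirrors in a packet x packet block proportional to the
    requested level (assumed here to be an 8-bit value, 0..255)."""
    n_mirrors = packet * packet
    n_on = round(gray / 255 * n_mirrors)
    mask = np.zeros(n_mirrors, dtype=bool)
    mask[:n_on] = True  # fill order within the packet is arbitrary here
    return mask.reshape(packet, packet)

print(packet_mirror_mask(128))  # about half of the 3 x 3 mirrors on
```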
[0010] For example, the pixel activation commands received by the input interface 111 may, in one embodiment, relate to a single pixel of the matrix, insofar as the input interface 111 and the control unit 112 are capable of operating in individual pixel addressing mode. However, in another embodiment of the device, it is possible to use a projector operating in frame mode, taking advantage of the high temporal resolution of existing DLP projectors in comparison with other types of projectors (1440 Hz, 4 kHz, 22 kHz). In this mode of operation, the activation commands will include frames in which only the pixels targeted by the activation command are turned on.

The control unit 112 comprises a data processing unit comprising a processor operably coupled to a memory (not shown in the figure). Depending on the embodiment, the memory may contain software instructions which, when executed by the processor of the data processing unit, cause that unit to control the input interface 111 and the DMD. The processing unit may be a component implementing a processor or a computing unit for generating DMD control commands and controlling the input interface 111. In addition, the input interface 111 and the different components of the control unit 112 may, separately or jointly, be implemented in software form, as described above, in hardware form, as an application-specific integrated circuit (ASIC), or in the form of a combination of hardware and software elements, such as for example one or more software programs intended to be loaded and executed respectively on one or more FPGA (Field Programmable Gate Array) components. They can be implemented, separately or jointly, in the form of an electronic circuit, or within one or more electronic components (chip or chipset).

Figure 3 illustrates the architecture of the optical subsystem of the projector 104 according to a particular embodiment of the proposed device. FIG. 3 shows a light source (20) arranged to emit an optical projection signal in the direction of a micro-mirror matrix (21). This optical projection signal is adapted according to the application implemented by the device and the regulations in force. For example, in the case of the use of the device by a person equipped with a visual photodiode implant, the optical projection signal will be adapted to provide a luminous intensity per pixel capable of stimulating the photodiodes of the implant. Similarly, in the case of the use of the device by a person who has benefited from optogenetic treatment, the optical projection signal will be adapted to provide a luminous intensity per pixel, at the wavelengths specific to this application, capable of stimulating the treated area. The light source (20) is thus chosen with, in particular, a spectral range and an intensity corresponding to the intended application. The light source (20) may for example be a laser source, or an incoherent light source. It is preferably chosen to be portable, so that the user of the proposed device can easily transport the device and all the accompanying elements (light source, power supplies, etc.).
[0011] In addition, the light source (20) can be remote, the resulting optical projection signal being transported to illuminate the matrix of micro-mirrors, for example by an optical fiber (not shown in the figure). In this case, the light source (20) may comprise a laser diode coupled to an optical fiber transporting the light signal to the DMD component.
[0012] A first collimation lens (22) makes it possible to obtain an optical projection signal focused on the entire surface of the micro-mirror matrix used. This lens will therefore be arranged and positioned according to the selected DMD component (21) and the light source (20). The light beam from the first lens (22) illuminates the micro-mirror matrix, which is positioned so that the micro-mirrors oriented to reflect light toward an output optic reflect the light in a direction chosen to illuminate the area of the eye in which are located the photoreceptors to be stimulated (whether photoreceptors naturally present in the eye, treated or not by optogenetics, or photoreceptors of an implant). An arrangement of the optical subsystem will preferably be chosen which makes it possible to illuminate the foveal zone of the eye locally. In one embodiment, the micro-mirror matrix is positioned so that the micro-mirrors reflect light toward an output optic in a direction substantially aligned with the optical axis of the eye. A second collimation lens (23) makes it possible to collect the light reflected by a micro-mirror or a group of co-located micro-mirrors into a light beam whose cross-section corresponds in size to that of the area of the eye to be illuminated. Preferably, the second lens will be arranged so that the resulting light beam locally illuminates this area of the eye on a portion corresponding to the position of the micro-mirror or group of micro-mirrors in the DMD matrix.
[0013] The optics of the eye indeed functions as a focusing lens whose focal plane corresponds to the foveal area. The beam from the second lens 23 is thus focused to locally stimulate the foveal area. It is also possible, in a particular embodiment, to add to the second lens 23 a focusing lens for correcting vision defects, such as for example a liquid lens making it possible to focus on objects located at a distance ranging from ten centimeters to infinity. This liquid lens can be controlled, for example, by a potentiometer allowing the user of the device to choose from several modes of vision, such as near vision, vision at 1.50 m, and far vision. In one embodiment, the beam from the second lens 23 is sized to illuminate an area a little wider than the foveal area. The beam may for example have a diameter of the order of 4.5 mm to prevent loss of illumination of the foveal area when the eye moves.

Figure 4a illustrates a first embodiment of the optical subsystem of the projector. FIG. 4a shows a collimation lens 32 combined with a microlens array 35 for focusing the light of a light beam 34a, issuing from a transport optical fiber coupled to a light source (not shown in the figure), into an optical projection signal 34b capable of illuminating all the micro-mirrors of the micro-mirror matrix (DMD) 31 with the required characteristics (in particular power and wavelength).
[0014] The matrix of lenses makes it possible to limit the diffraction generated by the large number of micro-mirrors. Thus, in one embodiment of the optical subsystem of the projector using a DMD component, a lens matrix may be inserted in the light path, possibly combined with a modulation performed on the optical fiber, in order to reduce the speckle effect.
[0015] An optical block composed of a lens 36 and an optical prism 37 makes it possible to deviate the direction of the optical projection signal by an angle corresponding to the tilt angle of the DMD mirrors about their pivot axis. For example, some DMD components operate with micro-mirrors that rotate about an axis at an angle of 12°; they therefore reflect light in a direction 12° away from the axis perpendicular to the matrix. Emitting light toward the matrix at an angle of 12° compensates for this difference. The optical projection signal 34b is then reflected on a dichroic mirror 38 (polarizing beam splitter) to be directed toward the matrix of micro-mirrors 31. A quarter-wave plate 39 is placed between the matrix of micro-mirrors 31 and the dichroic mirror 38, on the path of the optical stimulation signal 34c resulting from the reflection by the matrix of micro-mirrors 31, to modify the polarization of the signal 34c so that it can pass through the dichroic mirror 38 toward the output optics. The output optics comprises an optical window 40 and a second collimating lens 33 playing a role corresponding to that of the second collimating lens 23 of FIG. 3.

FIG. 4b illustrates a second embodiment of the optical subsystem of the projector. The light beam 54a, for example generated by a laser diode and transported by optical fiber to the optical subsystem of the projector, is reflected by an optical prism 52 located at the input of the optical subsystem. FIG. 4b shows a collimation lens 53, possibly combined with a lens matrix, for concentrating the light of the light beam 54a issuing from a transport optical fiber into an optical projection signal 54b able to illuminate all the micro-mirrors of the micro-mirror matrix (DMD) 51 with the required characteristics (in particular power and wavelength). An optical block composed of a dichroic mirror 59 and a first optical prism 55, separated from a second optical prism 57 by a lens 56, makes it possible to guide the optical projection signal 54b toward the micro-mirror matrix of the DMD component 51. As previously, the projection offset introduced by the DMD component is compensated by positioning the latter at a corresponding angle (in the example illustrated in the figure, the angular difference is 12°). The path of the optical stimulation signal 54c resulting from the reflection by the micro-mirror matrix 51 is the inverse of that of the optical projection signal 54b from the matrix of the DMD component to the dichroic mirror 59, where the optical stimulation signal 54c is reflected toward the output optics. The output optics comprises an optical window and collimating lens block 58 playing a role corresponding to that of the second collimating lens 23 of FIG. 3. The elements 52, 53, 55, 58 and 59 described above are also shown in FIG. 4c, where the path of the light beam 54a and of the optical stimulation signal 54c within these components can be seen from another angle than that of FIG. 4b. The size indications of the various components in FIG. 4b illustrate an exemplary embodiment of the optical subsystem which makes it possible to obtain equipment of very small size, compact enough to be mounted on the internal face of a spectacle lens, as described below. The dimensions and associated tolerances indicated in FIGS. 4b and 4c are expressed in mm.
Returning to Figures 1 and 2, the projector 104 is mounted on a support so as to direct the optical stimulation signal from the optical subsystem of the projector toward the area of the eye to be illuminated. This support may be in the form of a pair of spectacles 150, as illustrated in FIGS. 5a, 5b and 5c, which show a projector module 104, 153 mounted on the inner face of a glass of a pair of spectacles.
[0016] This support may also be in the form of a mask, or a helmet, and will preferably be shaped to be positioned on the head of the user, with the projector arranged on the support so as to illuminate photoreceptors of the eye when the device is used. FIG. 5a shows a pair of spectacles 150 on which a projector module 153 is mounted. The projector module comprises an electronic subsystem and an optical subsystem that can be produced according to the implementation examples described above. The projector 153 illustrated in FIG. 5a comprises an optical fiber coupling interface 154, an output optic 155 for the photoreceptor stimulation optical signal, and an optoelectronic block 156 comprising a matrix of micro-mirrors. The projector module may comprise a DLP projector controlled using the methods described hereinafter, which may for example comprise a DMD component of size 4.6 mm x 5.2 mm, and support a display resolution, that is to say a size of the displayed image in pixels, of 608 x 684 pixels (corresponding to the WVGA standard) and a temporal resolution of up to 1440 Hz. These components are small enough to allow mounting of the projector module on a pair of glasses as illustrated in FIGS. 5a, 5b and 5c.

Hereinafter, pixel activation control methods that can be implemented within the proposed device are described. Referring to FIGS. 1 and 6a, the pixel activation control unit 102 independently drives each pixel of a pixel matrix for the activation of that pixel. The device 100 receives (500), through the input interface 101, asynchronous information representative of events corresponding to light variations for the pixel. For example, for a pixel at position (x0, y0) in the pixel matrix (the pixel positioned on the row of index x0 and the column of index y0 in an M x N matrix, with x ∈ {0, ..., M - 1} and y ∈ {0, ..., N - 1}), the information received will include asynchronous information for the pixel at position (x0, y0). This asynchronous information is processed by the data processing unit 102a to identify an event for the pixel and to control (501) a first activation of the pixel at an activation time determined by the identified event.
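A minimal sketch of this event-driven control loop (steps 500-501), reusing the hypothetical Event structure sketched earlier; projector.activate(...) stands in for whatever command interface the projector exposes and is an assumption, not the patent's API.

```python
def activation_control(event_stream, projector, max_level=1023):
    """Steps (500)-(501): for each event received on the input interface,
    drive a first activation of the corresponding pixel at the instant
    determined by that event (event-driven, no frame clock)."""
    for ev in event_stream:
        if ev.gray is not None:            # gray-level event variant
            level = ev.gray
        else:                              # ON/OFF event variant
            level = max_level if ev.polarity is Polarity.ON else 0
        projector.activate(ev.x, ev.y, level)  # activation driven by the event
```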
[0017] The identification of an event may target events characterized by a first piece of information indicating an instant of occurrence of the event, and a second piece of information relating to a luminous characteristic for the pixel at that instant. For example, the identification of an event may include the detection of two peaks or pulses in a signal carrying the asynchronous information, the first indicating an instant of occurrence of the event and the second a gray level characteristic of the event for the pixel. In the embodiment in which the events are characterized by the instant of occurrence of a variation of light intensity beyond a threshold and the direction of this variation, the activation command may comprise an illumination level of the pixel determined by applying the detected variation to the illumination level of the previous activation command for the pixel. In the embodiment in which the events are characterized by the instant of occurrence of a variation of light intensity beyond a threshold and a gray level associated with this variation, the activation command may comprise an illumination level of the pixel corresponding to the gray level determined upon detection of the event.

This first activation of the pixel is therefore controlled upon identification of an event for the pixel, using the asynchronous information received at the input of the image sequence display device 100. In a particular embodiment, the activation is driven as soon as an event is identified for a pixel, within the processing time required by the system 100 to process the information relating to the event. As a variant, the system 100 may maintain an activation time reference, on the basis of which the activations of the pixels will be driven at times respectively corresponding to the events identified for each pixel. As described above, each event can be characterized by an instant of occurrence and one or more values corresponding to respective light information (light intensity, gray level, color, etc.).

In a particular embodiment of the proposed device, a second activation of the pixel may be controlled (502) following the first activation, to repeat it at respective times defined by a refresh sequence. The first activation can therefore be followed by one or more activations intended to refresh the activation performed upon identification of an event. It can be provided that the refresh sequence defines pixel activation times separated by a time interval. This time interval may be common to all the pixels of the matrix, or defined for each pixel or for different subsets of pixels. It can in particular be determined according to the retinal persistence of the human eye. This makes it possible to choose refresh interval values large enough to avoid displaying redundant information at a high frequency to the detriment of the efficiency of the system, while taking into account the duration of the retinal persistence of the previous activation. For example, the refresh time interval can be chosen between 40 ms and 150 ms, knowing that the higher the chosen value, the more efficient the pixel activation control will be, since redundant activations are avoided all the more.
[0018] A second activation of the pixel can therefore be controlled to refresh the activation of the pixel within the time interval thus determined, running from the pixel activation that followed the previous activation command. This previous activation command can be an activation command upon event identification as described above, or a refresh activation command in the case, for example, where no activation upon event identification has been controlled for a period corresponding to the current refresh time interval since the previous pixel activation.

FIG. 6b illustrates a particular embodiment of the proposed method, in which the device 100 of FIG. 1 receives a signal carrying asynchronous information, processes this signal to detect events, and then generates pixel activation commands transmitted to the projector 104 in a format appropriate to the latter. With reference to FIGS. 1 and 6b, the image sequence display device 100 receives (400), via the input interface 101, an asynchronous signal carrying information representative of events corresponding to light variations for the pixel. The asynchronous signal is processed by the data processing unit 102a to detect (401) information representing an event for the pixel. As described above, according to the embodiment of the proposed device, the detection of information representing an event may target events characterized by the instant of occurrence of a variation in light intensity beyond a threshold (which could be, in one embodiment, specific to the pixel) and the direction of this variation, or target events characterized by the instant of occurrence of a variation of light intensity beyond a threshold (which may be, in one embodiment, specific to the pixel) and a gray level associated with this variation.
[0019] The information representing an event for the pixel is processed by the data processing unit 102a to generate (402) a first pixel activation command. The asynchronous character of the input signal of the image sequence display device 100 will be all the better respected as the time to generate and transmit a pixel activation command, from the moment an event is detected for the pixel, is short. Real-time processing will thus be preferred in order to obtain an almost instantaneous activation as a function of the events detected for each pixel. The first activation command, generated on detection of an event for a pixel, is transmitted to the projector 104 to which the control unit 102 is interfaced.

As described above, this first pixel activation, performed on detection of an event for the pixel, may be followed by a second activation to refresh the previous activation on event detection. In this case, the second activation gives rise to the generation (403) of a second activation command. This second activation command is generated in such a way that the activation that follows refreshes the activation previously performed for the pixel. To do this, a refresh time interval is determined, whose duration defines the maximum time difference between two activations for one pixel. This time interval may be common to all the pixels of the matrix, or defined for each pixel or for different subsets of pixels. It can for example be determined according to the retinal persistence of the human eye. This makes it possible to choose refresh interval values large enough to avoid displaying redundant information at a high frequency to the detriment of the efficiency of the system, while taking into account the duration of the retinal persistence of the previous activation. A second activation command can therefore be generated (403) to refresh the activation of the pixel within the time interval thus determined, running from the pixel activation that followed the previous activation command. This previous activation command may be an activation command on event detection as described above, or a refresh activation command in the case, for example, where no activation command on event detection was generated for a period corresponding to the current refresh time interval since the previous activation.

Figures 7a and 7b illustrate an asynchronous signal sequence for a pixel and the corresponding activation commands. FIG. 7a illustrates the information carried by an asynchronous signal sample in which the events for a pixel are represented on a time axis by Dirac peaks of amplitude G(t) corresponding to a gray level. Five events e1, e2, e3, e4 and e5 are represented in FIG. 7a, respectively positioned at times te1, te2, te3, te4 and te5 with respect to a time reference t0, with te1 < te2 < te3 < te4 < te5. Each of the five events carries gray-level information for the considered pixel, denoted g(t = tei), with i ∈ {1, 2, 3, 4, 5}. These gray level values may for example result from the quantization of a gray level value over 2^nquant levels, coded on nquant bits. FIG. 7b shows a simplified view of the sequence of events illustrated in FIG. 7a, in which the events are represented by Dirac peaks of constant amplitude. The time interval Δtp is defined according to the retinal persistence of the eye, so as to correspond to a maximum time interval separating two consecutive activations of the same pixel.
In the example illustrated in FIG. 7b, the time differences separating two consecutive events do not exceed the quantity Δtp, with the exception of the difference te3 - te2 between the events e2 and e3. In the example of an event sequence illustrated in FIGS. 7a and 7b, it will be possible to generate an activation command for the pixel following the detection of the event e1, for an activation with the characteristics (typically a gray level) carried by the asynchronous signal processed for the event e1. The same is true of events e2, e3, e4 and e5, the detections of which, in one embodiment, may each give rise to the generation of an activation command with respective characteristics. In a particular embodiment of the proposed method, an activation command will also be generated to perform a refresh activation within a predetermined time interval following the activation of the pixel upon detection of the event e2. This activation command may for example be generated if no event has been detected in the time interval [te2; te2 + Δtp] of duration Δtp from instant te2. Alternatively, it may be generated at any time during the time interval [te2; te2 + Δtp] of duration Δtp from instant te2.

FIG. 7c shows an example of a sequence of activation commands generated according to an embodiment of the proposed method applied to the sequence of events of FIG. 7a. The activation commands are represented in the figure by Dirac peaks of variable amplitude. It shows activation commands Ce1, Ce2, Ce3, Ce4 and Ce5, respectively corresponding to the events e1, e2, e3, e4 and e5 of FIG. 7a, generated respectively at times tce1, tce2, tce3, tce4 and tce5. Each of the commands carries information about the gray level at which the pixel is to be illuminated, noted in the figure g(tei), with i ∈ {1, 2, 3, 4, 5}. In addition to the activation commands Cei generated on detection of an event ei, a refresh activation command Cr,e2 is generated at time te2 + Δtp following the non-detection of a new event after that of the event e2 during a duration Δtp. The refresh activation command Cr,e2 may be generated with the same activation characteristics, for example the same gray level g(te2), as those included in the event activation command Ce2. As a variant, the refresh activation command Cr,e2 may be generated with activation characteristics determined according to those determined for the event activation command Ce2.

The pixels of the matrix may also be refreshed as a group, with a routine and periodic refresh of the matrix pixels performed at a predetermined refresh rate. The advantage of this embodiment is to avoid individualized management of each pixel with respect to refreshing. The refresh of the matrix is then repeated at a frequency at least equal to that corresponding to the shortest refresh interval defined for a pixel. According to this variant, the refreshing of the pixel matrix is performed periodically, while remaining completely decorrelated from and independent of the sequence of event activation commands. The generation of the refresh activation command is independent of that of the event activation command, and the refresh rate is chosen so that the difference between an event activation of each pixel and the immediately subsequent refresh activation does not exceed a predetermined time interval Δtp which, as discussed above, may be selected according to the retinal persistence of the human eye in the range of 40 ms to 150 ms.
This "mixed" activation mode makes it possible to combine an event activation operating asynchronously with a periodic synchronization activation performed synchronously.
[0020] Thus, depending on the occurrence of the events for each pixel, the refresh activation of the pixel may occur within a very short time following an event activation of the pixel, or at the end of a predetermined duration following an event activation of the pixel, this duration being defined as the maximum separation between two activations of the pixel. This mixed mode makes it possible to lighten the management of the refresh activation while, thanks to the asynchronous event activation of each pixel, limiting the activation of redundant information, since the asynchronous activation makes it possible to choose low refresh rates compared to current systems. With reference to FIG. 1, each generation of an event activation command may, for example, result in the data processing unit 102a recording in memory 103b the activation characteristics relating to the generated command, so that these characteristics can be retrieved by the unit 102 for managing the refresh activation.

Figure 7d schematically shows a sequence of activation commands generated according to another embodiment of the proposed method applied to the sequence of events of FIGS. 7a and 7b. With reference to FIG. 7b, the event e4 is detected with a time spacing te4 - te3 less than the time interval Δtmin, defined as the minimum time interval separating two successive pixel activation commands. The activation command sequence of FIG. 7d differs from the sequence illustrated in FIG. 7c in that no activation command is generated for the event e4, since the time difference between the instant of detection of this event and the instant of detection of the immediately preceding event is less than the predefined threshold Δtmin. The management of the separation time interval for each pixel can be performed by means of a timer, set to the value Δtmin defined for the pixel and triggered at each generation of an event activation command for the pixel. For example, an event detected while this timer has not expired for the pixel may lead to ignoring the event and not generating a corresponding event activation command. The activation characteristics carried by the asynchronous information for this ignored event can also be ignored. As a variant, provision may be made for an implementation in which, even if no event activation command is generated, the activation characteristics corresponding to the event are recorded in memory 103b for later use, for example for the next pixel refresh. Depending on the embodiment, a single timer can be used for the management of the refresh interval and that of the separation interval. This timer can be triggered at each generation of a pixel activation command, recording an indicator to distinguish the event activation commands from the refresh activation commands.

FIG. 8 shows a scene display system 800 comprising an acquisition subsystem 801 and a display subsystem 802. The acquisition subsystem 801 includes a light acquisition device adapted to generate a signal carrying asynchronous information representing, for each pixel of a matrix of pixels, events respectively corresponding to light variations in the scene. It is operatively coupled to the display subsystem 802, which includes a display device as described above, including an input interface configured to receive a signal carrying asynchronous information.
The signal carrying asynchronous information transiting over the coupling interface between the subsystems 801 and 802 may have different forms, or formats, corresponding to different embodiments of the system. The coupling interface may also be provided according to different standard formats, such as, for example, the USB format. The proposed system is not limited to a particular asynchronous information format, to a particular carrier of this information (for example, an asynchronous signal carrying information representative of an event flow), or to a specific format of the coupling interface between the subsystems 801 and 802. The asynchronous signal received by the subsystem 802 carries information representative of temporal events corresponding to light variations in a scene. For this purpose, the subsystem 802 may comprise an input interface configured to receive a signal produced by the acquisition subsystem 801.

FIG. 9a shows an example of implementation of the acquisition subsystem 801 according to a particular embodiment of the proposed display system. FIG. 9a shows an acquisition subsystem 801 comprising a light acquisition device 200 comprising an event-based asynchronous vision sensor (201) placed facing a scene and receiving the luminous flux of the scene through acquisition optics (202). The sensor (201) may comprise a group of photosensitive elements organized in a matrix of pixels, so that each pixel of the matrix corresponds to a photosensitive element of the sensor. For each pixel of the matrix, the device (200) generates an event-based asynchronous signal sequence from the light variations sensed by the pixel in the scene appearing in the field of view of the device (200). Each pixel, corresponding to a photosensitive element, thus produces temporal events respectively corresponding to light variations in the scene.

The sensor 201 therefore does not produce video frames constituted by the matrix of pixels corresponding to the photosensitive elements of the sensor at a predetermined sampling frequency. It reacts, for each pixel of the matrix, to events corresponding to light variations for the pixel. Conversely, it does not produce information for a pixel if no event occurs for that pixel. In particular, it does not perform any systematic capture of the luminous intensity of the pixels of the matrix. Thus, the events to which it reacts are asynchronous, and do not depend on a video frame acquisition rate. This makes it possible to greatly reduce, if not eliminate, the redundancies created by the acquisition of video frames at a fixed rate that does not take into account the absence of change in the information carried by a pixel from one frame to another. A processing unit (203) processes the information coming from the sensor (201) and representative of the events generated asynchronously by the different pixels, to generate an asynchronous signal carrying this information.
[0021] An example of the acquisition principle of this asynchronous sensor is illustrated in FIGS. 9b-9d. According to this example, the information consists of a succession of instants, denoted tk (k = 0, 1, 2, ...), at which an activation threshold Q is reached. The sensor 201 is therefore provided with a variation detector which, for each pixel, measures and records the luminous intensity of the pixel when this intensity has varied beyond a threshold Q. FIG. 9b shows an example of a light intensity profile P1 seen by a pixel of the matrix of the asynchronous vision sensor. Whenever this intensity increases by a quantity equal to the activation threshold Q in comparison with what it was at the instant tk, a new event is identified and a positive spike (level +1 in FIG. 9c) is emitted at the instant the differential threshold Q is exceeded, denoted tk+1. Symmetrically, each time the intensity of the pixel decreases by the quantity Q in comparison with what it was at the instant tk, a new event is identified and a negative spike (level -1 in FIG. 9c) is emitted at the instant the differential threshold Q is exceeded. FIG. 9d shows the intensity profile P2 which can be reconstructed as an approximation of the profile P1 by time integration of the asynchronous signal of FIG. 9c. The activation threshold Q may be fixed, as in the case of FIGS. 9b-9d, or adaptive as a function of the light intensity. For example, the threshold ±Q may be compared to the variations of the logarithm of the light intensity for generating a ±1 event.

The class of asynchronous photosensitive sensors generating events from light intensity variations is referred to by the acronym DVS, for "Dynamic Vision Sensor". For example, the sensor 201 may be a DVS sensor of the kind described in "A 128x128 120 dB 15 µs Latency Asynchronous Temporal Contrast Vision Sensor", P. Lichtsteiner et al., IEEE Journal of Solid-State Circuits, Vol. 43, No. 2, February 2008, pp. 566-576, or in US patent application 2008/0135731 A1. Another generation of asynchronous photosensitive sensors makes it possible to generate asynchronous information indicating events as well as an associated characteristic, for example a gray level. The article by Posch, C., Matolin, D., and Wohlgenannt, R. (2011), entitled "A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS", published in the IEEE Journal of Solid-State Circuits, 46(1), pp. 259-275, doi:10.1109/JSSC.2010.2085952, provides a description of examples of events coded by gray levels. The asynchronous information for each pixel is again a succession of pulses or spikes positioned in time at instants tk depending on the light profile for the pixel. Each event may, for example, correspond to two successive pulses, the first indicating the instant of the event and the second making it possible to determine a gray level for the pixel as a function of the time difference between the two pulses. The information corresponding to an event for a pixel thus comprises a first piece of information relating to an instant of occurrence of the event, and a second piece of information relating to a luminous characteristic (gray level) for the pixel at that instant. The acquisition subsystem 801 can incorporate, in one embodiment, a new-generation event-based asynchronous vision sensor 201, sometimes referred to as ATIS, for "Asynchronous Time-Based Image Sensor".
The acquisition subsystem and the ATIS sensor that it incorporates may be, for example, of the type described in the article by C. Posch et al., entitled "An Asynchronous Time-based Image Sensor" (IEEE International Symposium on Circuits and Systems, 2008, pp. 2130-2133), or of the type described in the article by C. Posch et al., entitled "A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS" (IEEE Journal of Solid-State Circuits, 46(1):259-275, 2011). The dynamics of the retina (minimum duration between action potentials), of the order of a few milliseconds, can be suitably reproduced with a sensor of the DVS or ATIS type. The dynamic performance is in any case much greater than that achievable with a conventional video camera having a realistic sampling frequency. For example, a sensor of this type achieves microsecond time resolutions with a luminance range greater than 120 dB, which is much higher than that of a standard CMOS/CCD camera, which typically has a luminance range of 60-70 dB.
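The two-pulse gray-level encoding described above for ATIS-type sensors can be sketched as follows. The inverse mapping between inter-pulse delay and gray level is an assumed choice for illustration (a brighter pixel reaching its measurement threshold sooner); the actual encoding is defined by the sensors cited above.

```python
def decode_atis_event(t_first: float, t_second: float, k: float = 1.0):
    """Decode one ATIS-style event: the first pulse gives the instant of
    the event, and the spacing to the second pulse encodes a gray level.

    The gray level is taken here as inversely proportional to the delay
    (a shorter measurement time meaning a brighter pixel) -- an assumed
    mapping used for illustration only.
    """
    dt = t_second - t_first
    if dt <= 0:
        raise ValueError("second pulse must follow the first")
    gray = k / dt  # assumed, sensor-specific mapping
    return t_first, gray

# A pulse pair 2 ms apart decodes to instant 0.010 s, gray level 500 (k = 1).
print(decode_atis_event(0.010, 0.012))
```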
[0022] In one embodiment, the processing unit 203 comprises a processor operatively coupled to a memory. The memory may contain software instructions which, when executed by the processor of the data processing unit, cause this unit to process the signals received from the sensor, to generate the asynchronous information representing, for each pixel, events corresponding to light variations on the pixel according to the various methods described herein, and to transmit the asynchronous information on an output interface. The processing unit may be a component implementing a processor or a calculation unit for the generation of the asynchronous information according to the various methods described and for the control of the asynchronous vision sensor of the device 200 within the subsystem 801. Furthermore, the processing unit, and in particular its processor and/or its memory means, may be, separately or jointly, implemented in software form, as described above, in hardware form, as an application-specific integrated circuit (ASIC), or in the form of a combination of hardware and software elements, such as, for example, one or more software programs intended to be loaded and executed respectively on one or more components of FPGA (Field Programmable Gate Array) type. They may be implemented, separately or jointly, in the form of an electronic circuit, or within one or more electronic components (chip or chipset). With reference to FIGS. 5b, 5c and 5d, the asynchronous vision sensor may be mounted on the support on which the projector is mounted. FIGS. 5b, 5c and 5d show the exemplary embodiment in which the support is a pair of spectacles 150. The sensor 151 is mounted on the upper part 152 of the spectacle frame. Preferably, the sensor will be mounted centered on the upper part 152 of the spectacles, so as to avoid parallax errors between the acquired scene and that reproduced by stimulation of the photoreceptors of an eye by the projector 153.
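As a software illustration of the role assigned to the processing unit 203, the following sketch reads events from a sensor abstraction and forwards each one, timestamped, over an output interface. The Sensor and Output protocols, the stub classes, and the packed byte layout are assumptions made for the example, the proposed system being tied to no particular signal format.

```python
import struct
from typing import Protocol, Tuple

class Sensor(Protocol):
    def read_event(self) -> Tuple[int, int, int, int]:
        """Block until a pixel fires; return (t_us, x, y, polarity)."""
        ...

class Output(Protocol):
    def write(self, data: bytes) -> None: ...

def run_processing_unit(sensor: Sensor, out: Output, n_events: int) -> None:
    """Forward n_events sensor events as an asynchronous byte stream.

    Each event is packed as little-endian (uint64 t_us, uint16 x,
    uint16 y, int8 polarity) -- an assumed wire format for the sketch.
    """
    for _ in range(n_events):
        t_us, x, y, pol = sensor.read_event()
        out.write(struct.pack("<QHHb", t_us, x, y, pol))

# Minimal stubs standing in for real hardware interfaces.
class StubSensor:
    def __init__(self, events): self._it = iter(events)
    def read_event(self): return next(self._it)

class StubOutput:
    def write(self, data: bytes): print(data.hex())

run_processing_unit(StubSensor([(1000, 12, 5, 1), (4100, 40, 7, -1)]), StubOutput(), 2)
```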
[0023] In a variant, provision may be made to arrange the sensor on a surface of the spectacles, such as the inner or outer surface of one of the glasses of the spectacles 150. By "glass" is meant here a surface-shaped portion fixed under the upper part of the spectacle frame, which may be made of glass but also of any other material, not necessarily translucent. A pair of spectacles comprises two glasses, each glass having an inner surface and an outer surface. When the projector 153 is mounted on a first surface 158 of the spectacles 150, the control unit of the device and the processing unit of the acquisition subsystem may be mounted on a second surface 159 of the spectacles. These two units may be implemented on an electronic module 157 fixed on the inner surface of one glass of the spectacles, the projector 153 being fixed on the inner surface of the other glass, as shown in FIGS. 5b and 5d.
[0024] Alternatively, an optoelectronic block consisting of the sensor, the control unit of the device and the processing unit of the acquisition subsystem may be mounted on a surface of one of the spectacle glasses, the projector being mounted on a surface of the other glass or on the other surface of the same glass. Although described through a number of detailed exemplary embodiments, the activation control method and the equipment for carrying out the method comprise various variants, modifications and improvements which will become apparent to the person skilled in the art, it being understood that these various variants, modifications and improvements are within the scope of the invention, as defined by the following claims.
[0025] In addition, various aspects and features described above may be implemented together, or separately, or substituted for each other, and all of the various combinations and sub-combinations of aspects and features are within the scope of the invention. In addition, some of the systems and equipment described above may not incorporate all of the modules and features described for the preferred embodiments. The information and signals described in this document can be represented according to a variety of technologies and techniques. For example, instructions, messages, data, commands, information, signals, bits and symbols may be represented by voltages, currents, electromagnetic waves, or a combination thereof. Depending on the embodiment chosen, certain acts, actions, events or functions of each of the methods described herein may be performed or occur in a different order from that in which they have been described, or may be added, merged, or not be performed or not occur, as the case may be. In addition, in certain embodiments, certain acts, actions or events may be carried out or occur concurrently rather than successively.
Claims
1. An image sequence display device in the form of a pixel array, comprising a control unit operably coupled to a projector, the control unit comprising an input interface configured to receive asynchronous information representing, for each pixel of the array, events concerning the pixel; and a processor configured to control the activation of each pixel of the array at times determined by respective events indicated by the asynchronous information for said pixel; wherein the projector is arranged on a support so as to illuminate photoreceptors of the eye during use of the device, and configured to project a luminous flux corresponding to the pixels activated by the control unit.
2. Device according to claim 1, wherein the control unit is further configured to, after activation of a pixel of the matrix at a given instant by an event indicated by the asynchronous information, repeat the activation command for said pixel substantially at the same activation level at instants defined by a refreshing sequence.
3. The device according to claim 1 or 2, wherein the projector comprises a micro-mirror array, a micro-mirror array drive unit, a control input for receiving the pixel activation commands, and an optical input for receiving a luminous flux.
4. Device according to any one of the preceding claims, wherein the projector support is in the form of a pair of glasses, the projector being placed on a surface of the glasses.
5. A scene display system, comprising a display subsystem operatively coupled to an acquisition subsystem, wherein: the display subsystem comprises a device according to any one of claims 1 to 4; and the acquisition subsystem comprises a sensor disposed facing the scene, operatively coupled to a processing unit configured to generate asynchronous information representing events for each pixel.
6. The system of claim 5, wherein the sensor is a light sensor comprising scene acquisition optics and an array of photosensitive elements.
7. System according to claim 5 or 6, wherein the sensor is mounted on the projector support so that the captured scene substantially corresponds to the visual field of a user of the device.
8. System according to any one of claims 5 to 7, wherein: the projector support is in the form of a pair of glasses; the projector is mounted on a first surface of the glasses; the sensor is mounted on the upper part of the spectacle frame; and the control unit of the device and the processing unit of the acquisition subsystem are mounted on a second surface of the glasses.
9. System according to any one of claims 5 to 7, wherein: the projector support is in the form of a pair of glasses; the projector is mounted on a first surface of the glasses; and the sensor, the control unit of the device and the processing unit of the acquisition subsystem are mounted on a second surface of the glasses.