IMAGING DEVICE AND IMAGING METHOD
Patent abstract:
An imaging device includes a light source arranged to generate an optical signal, an optical medium coupled to the light source and arranged to project an excitation signal of substantially constant light intensity from the light source to a body to be observed when using the device, and an asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a first pixel array, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel. Publication number: FR3031035A1 Application number: FR1463335 Filing date: 2014-12-24 Publication date: 2016-07-01 Inventors: Guillaume Chenegros; Nicolas Libert; Ryad Benosman; Jacques Durenteau; Anatole Harrois; Serge Picaud; Jose-Alain Sahel Applicants: Centre National de la Recherche Scientifique CNRS; Universite Pierre et Marie Curie Paris 6; Institut National de la Sante et de la Recherche Medicale INSERM; Primary IPC:
Patent description:
[0001] The present invention relates to imaging devices, and more particularly to imaging devices for the study of blood microcirculation. BACKGROUND OF THE INVENTION [0002] Various video-microscopic imaging techniques have been developed for monitoring the microcirculation and, more generally, for studying blood flow in vessels whose diameter is of the order of a micrometer. These techniques make it possible to visualize, for example, the displacement of blood cells, and in particular that of red blood cells, in the blood vessels, notably the blood capillaries. The OPS (Orthogonal Polarization Spectral) technique uses an incident light beam linearly polarized in a plane, which is projected onto the subject through a beam splitter. The analysis is carried out on the reflected light, whose polarization is modified according to the depth to which the incident light penetrates the tissues before being reflected. A large part of the reflected light comes from reflection by the upper layers of the observed tissues and retains its initial polarization, while the incident light that penetrates deeper into the observed tissues undergoes more scattering and loses its initial polarization. The analysis device comprises a camera in front of whose lens is disposed an orthogonal polarizer that blocks the portion of the reflected light which has retained its polarization, so as to generate an image from the reflected light not filtered out by the polarizer. The SDF (Sidestream Dark Field) technique uses dark-field illumination, provided by a circular prismatic lens arranged around the lens of a camera, to generate a halo of light around and beyond the focal point of the lens. The camera is typically located above a central area of the light halo so as to capture the backscattered light. 
This type of illumination, which allows a greater depth of observation and three-dimensional imaging of the visualized tissues, seems well suited to the analysis of blood microcirculation. An example of a functional microcirculation imaging device using the principle of dark-field microscopy is described in the international application published under the number WO 2005/032361 A2. [0003] Existing functional microcirculation imaging devices, however, do not allow the observation of rapid phenomena such as the circulation of red blood cells. They also have limitations for real-time observation and analysis of the microcirculation. There is thus a need for imaging devices that do not have the disadvantages of the conventional devices discussed above. In particular, a first need is to provide imaging devices that allow observation of rapid phenomena related to blood microcirculation, such as the displacement of red blood cells. Another need is to provide imaging devices that allow real-time processing of the generated information in order to analyze blood microcirculation. According to a first aspect, there is provided an imaging device comprising at least one light source arranged to generate an optical signal, an optical medium coupled to the light source and arranged to project an excitation signal of substantially constant light intensity from the light source to a body to be observed when using the device, and a first asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a first pixel array, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel. [0004] The use of an asynchronous camera for imaging fast phenomena, in particular biological phenomena, has many advantages. 
These result notably from the fact that the video acquisition is not controlled by a clock, but by the events occurring, for each pixel of the sensor, in the scene facing which the sensor is placed. Conventional cameras are, in contrast, driven by control and synchronization signals generated by the sensor from a clock frequency, signals which have no connection with the source of the visual information. This radically different mode of operation of asynchronous cameras makes it possible to achieve much higher temporal resolution values than those of conventional cameras, which makes it possible to implement imaging systems having high temporal and spatial resolutions with sufficient sensitivity to open a new field of possibilities in the observation and analysis of rapid biological phenomena such as the microcirculation. In addition, methods that acquire or synthesize a sequence of frame images have the drawback of producing highly redundant data, due to the fact that each frame represents a large number of pixels of an image, if not an entire image, and that all the pixels whose information does not change from one image to another generate redundancies in the data representing the image sequence. This redundancy can only be partially suppressed by compression encoding of a conventional video signal. Conversely, the asynchronous signals generated by asynchronous cameras make it possible to obtain a very compact representation of the data relating to a sequence of images, because these data, representing events for one pixel (and not for all the pixels of a matrix or a large number of them), are not redundant from one image to another. [0005] In one embodiment, the imaging device may include a second asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a second pixel array, a signal sequence representing asynchronous events corresponding 
to variations of the light backscattered by the body to be observed for the pixel, and wherein a first optical filter is arranged to filter the backscattered light picked up by the first asynchronous camera. The use of a second asynchronous camera allows certain types of analysis, such as spectral analysis, which use the properties of backscattered light acquired for the same observation at different wavelengths. Alternatively, the first optical filter may be used to filter the backscattered light picked up by the first asynchronous camera, and a second optical filter may further be arranged to filter the backscattered light picked up by the second asynchronous camera. This allows the light to be selected according to its wavelength, so that each asynchronous camera receives substantially simultaneously the information coming from the body being studied. In one embodiment, the imaging device may include a third asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a third pixel array, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel, and wherein a third optical filter is arranged to filter the backscattered light picked up by the third asynchronous camera. In one embodiment, the light source may be a light-emitting diode, for example driven by a DC voltage, or a laser source. [0006] In one embodiment, the device may include a plurality of light sources arranged on the optical medium to generate excitation signals of respective substantially constant light intensities. The luminous intensity may differ from one light source to another, and the embodiment in which the light sources are of substantially constant light intensity is a non-limiting embodiment of the proposed device. In addition, the light sources may be arranged on the optical medium so as to form a light ring. 
[0007] In one embodiment, the one or more asynchronous cameras may be arranged on the support located above an area not illuminated by the light source (s). In another aspect, a microcirculation imaging system comprising a device according to the various embodiments described herein is provided. In another aspect, there is provided an imaging method comprising: generating an optical signal of substantially constant light intensity; projecting a light excitation signal from the optical signal to a body to be observed; and generating a signal comprising, for each pixel of a first pixel array, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel. Other features and advantages of the present invention will become apparent from the following description of nonlimiting exemplary embodiments, with reference to the accompanying drawings, in which: FIG. 1 is a block diagram of an imaging device according to an embodiment of the proposed device; FIG. 2 is a block diagram of an asynchronous camera of an imaging device according to an embodiment of the proposed device; Fig. 3a is a diagram showing an exemplary light intensity profile at a pixel of an asynchronous sensor; FIG. 3b shows an exemplary signal delivered by the asynchronous sensor in response to the intensity profile of FIG. 3a; Figure 3c illustrates the reconstruction of the intensity profile from the signal of Figure 3b; Figures 4a-4b are diagrams similar to those of Figures 3a and 3b illustrating a light acquisition mode for use in another example of the method; Fig. 5 is a block diagram of an imaging device according to an embodiment of the proposed device; Fig. 
6 is a block diagram of a portion of an imaging device according to an embodiment of the proposed device; Figure 7a is a block diagram of an imaging device according to an embodiment of the proposed device; Figure 7b is a block diagram of portions of the imaging device of Figure 7a; Fig. 8 is a block diagram of a multi-camera imaging device according to an embodiment of the proposed device; Figures 9a and 9b are block diagrams of portions of multi-camera imaging devices according to different embodiments of the proposed device; Fig. 10 is a diagram illustrating the proposed imaging method according to one embodiment. In the following detailed description of embodiments of the invention, many specific details are presented to provide a more complete understanding. Nevertheless, those skilled in the art will appreciate that embodiments can be practiced without these specific details. In other cases, well-known features are not described in detail to avoid unnecessarily complicating the description. By "body" is meant herein a human or animal tissue, for example in the context of the use of the proposed device for imaging the microcirculation, or any object that it is desired to observe using the proposed imaging device or the proposed imaging method. The invention will be described hereinafter in the nonlimiting context of asynchronous information representing, for a pixel of a matrix of pixels, events corresponding to light variations for the pixel. The proposed devices and systems are not, however, limited to this particular embodiment; the events concerning the pixel may, according to the embodiment, correspond to variations of light for the pixel, to the detection of a shape of interest or to the detection of a primitive, and more generally to any type of asynchronous information for the pixel. [0008] Figure 1 shows a diagram of an imaging system according to an embodiment. 
The imaging system (1) illustrated comprises a light source (2) arranged to emit an optical signal (3). The light source (2) is coupled to an optical medium (4) arranged to project an excitation signal (5) of substantially constant optical intensity, derived from the signal (3) emitted by the source, towards the body (6) to be analyzed when using the system (1). The optical medium (4) is further arranged to receive the light (7) backscattered by the body (6) to be analyzed. In one embodiment, the backscattered light may undergo different treatments depending on the type of post-imaging processing (OPS processing, SDF processing, etc.). The optical medium (4) is further optically coupled to an asynchronous vision sensor (8), so that the asynchronous vision sensor (8) can react to events carried by the backscattered light signal (7). The asynchronous vision sensor (8) is provided with an interface (9) for outputting an asynchronous signal, on which different processing operations can be carried out for the intended application. The light source (2) may for example be of the light-emitting diode type fed continuously by an electrical source, a laser source, or an incoherent light source. In one embodiment, a focusing system (not shown in the figure) is positioned in the optical medium (4) between the light source (2) and the body (6) to be analyzed, in order to focus the excitation optical signal (5) projected onto the latter. The focal length of the focusing system, which may for example comprise a lens, will in practice be chosen according to the desired analysis resolution and the light source used. [0009] The excitation optical signal (5) is partly absorbed by the body (6) to be analyzed, which backscatters light in the form of a backscattered light signal (7). The optical medium (4) is arranged to receive this backscattered light signal (7) and transport it, after possible optical treatments, to the asynchronous vision sensor (8). 
[0010] Figure 2 schematically illustrates an asynchronous camera according to an embodiment of the imaging system. The asynchronous camera (10) is capable of generating a signal carrying asynchronous information representing, for each pixel of a matrix of pixels, events respectively corresponding to variations of light sensed by the camera. Depending on the intended application, the asynchronous camera (10) may be operatively coupled to a visualization subsystem and/or to a processing and analysis subsystem, each subsystem including an input interface configured to receive a signal carrying asynchronous information. The signal conveying asynchronous information passing through the subsystem coupling interface may have different shapes, or formats, corresponding to different embodiments of the system. The output interface of the asynchronous camera (10) can also be provided according to different standard formats, such as, for example, the USB format. The proposed system is not limited to a particular asynchronous information format, to a particular vector of this information (for example, an asynchronous signal carrying information representative of an event flow), or to a specific output interface format of the asynchronous camera (10). In one embodiment, the asynchronous signal generated at the output of the asynchronous camera (10) carries information representative of time events corresponding to light variations picked up by the camera (10). With reference to FIG. 2, the asynchronous camera (10) comprises a light acquisition device (11) comprising an event-based asynchronous vision sensor (13) placed opposite a scene to be observed and receiving the light stream of the scene through an acquisition optics (12). [0011] The acquisition optics (12) may comprise a lens whose characteristics (in particular the focal length and the diameter) are chosen to suit the characteristics of the sensor (13). 
The sensor (13) may comprise a group of photosensitive elements organized in a matrix of pixels, so that each pixel of the matrix corresponds to a photosensitive element of the sensor. For each pixel of the array, the device (11) generates an event-based asynchronous signal sequence from the light variations perceived by the pixel in the scene appearing in the field of view of the device (11). Each pixel, corresponding to a photosensitive element, thus produces time events respectively corresponding to light variations. [0012] The sensor (13) therefore does not produce video frames constituted by the matrix of pixels corresponding to the photosensitive elements of the sensor at a predetermined sampling frequency. It reacts, for each pixel of the matrix, to events corresponding to light variations for the pixel. Conversely, it does not produce information for a pixel if no event occurs for that pixel. In particular, it does not perform a systematic capture of the luminous intensity of the pixels of the matrix. Thus, the events to which it reacts are asynchronous and do not depend on a video frame acquisition rate. The camera (10) is thus asynchronous in particular in that it does not require a time reference defining a rate of acquisition of the light level information perceived by the pixels of the matrix. Each pixel sends its information asynchronously with respect to the other pixels of the matrix. This makes it possible to greatly reduce, if not eliminate, the redundancies created by the acquisition of video frames at a fixed rate that does not take into account the absence of change in the information carried by a pixel from one frame to the next. 
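To make the sparse, per-pixel nature of such an event stream concrete, the following is a minimal illustrative sketch of an address-event record and stream. The field names and units are assumptions for illustration only; the patent does not fix a data format.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One asynchronous event in an address-event style representation.

    Field names and units are illustrative, not taken from the patent.
    """
    t: float       # timestamp, e.g. in microseconds
    x: int         # pixel column in the matrix
    y: int         # pixel row in the matrix
    polarity: int  # +1 for a luminance increase, -1 for a decrease

# A sparse event stream: only pixels whose luminance changed produce data,
# so static regions of the scene generate no output at all.
stream = [
    Event(t=10.0, x=3, y=7, polarity=+1),
    Event(t=12.5, x=3, y=7, polarity=-1),
    Event(t=31.2, x=4, y=7, polarity=+1),
]

# Unlike a frame, the data volume scales with scene activity,
# not with the sensor resolution.
active_pixels = {(e.x, e.y) for e in stream}
print(len(stream), len(active_pixels))  # 3 events from 2 active pixels
```

A frame-based camera would instead emit every pixel of the matrix at each clock tick, regardless of whether anything changed.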
Conversely, a conventional camera is driven by a set of time synchronization and control signals independent of the source of the visual information, such as, for example, a pixel acquisition clock signal (typically operating at a rate of a few MHz), whose rhythm defines a frequency at which the acquisition of a set of pixels of the matrix is performed. The temporal resolution, corresponding to the minimum time interval during which the camera can detect a change, is therefore much coarser for a conventional camera (for example of the CCD or CMOS type), of the order of 20 ms, than for an asynchronous camera, of the order of a microsecond. A processing unit (14) processes the information from the sensor (13), representative of the events generated asynchronously by the different pixels, to generate an asynchronous signal (15) carrying this information. In one embodiment, the sensor (13) independently detects for each pixel the relative changes, and asynchronously and individually initiates for the pixel a measure of exposure value or gray level when - and immediately after - a change in brightness of a predetermined amplitude occurs in the field of view of the pixel. In general, the sensor (13) generates respective asynchronous signal sequences which represent events relating to the pixels of the array. In a particular embodiment, each asynchronous signal sequence indicates, or reports, events relating to a pixel, independently of the other signal sequences. For example, it may include data that identifies events with their respective characteristics. [0013] In one embodiment, the processing unit (14) comprises a processor operatively coupled to a memory. 
The memory may contain software instructions which, when executed by the processor of the data processing unit, cause this unit to process the signals received from the sensor, generate the asynchronous information representing, for each pixel, events corresponding to light variations relating to the pixel, and transmit the asynchronous information on an output interface. The processing unit may be a component implementing a processor or a computing unit for the generation of the asynchronous information according to the various methods described and for the control of the asynchronous vision sensor of the device (11) within the asynchronous camera (10). In addition, the processing unit, and in particular its processor and/or its memory means, can be, separately or jointly, implemented in software form, as described above, in hardware form, as an application-specific integrated circuit (ASIC), or in the form of a combination of hardware and software elements, such as, for example, one or more software programs intended to be loaded and executed respectively on one or more FPGA (Field Programmable Gate Array) components. They can be implemented, separately or jointly, in the form of an electronic circuit, or in one or more electronic components (in English, chip or chipset). An example of the asynchronous sensor acquisition principle is illustrated in FIGS. 3a-3c. According to this example, the information consists of a succession of instants, denoted tk (k = 0, 1, 2, ...), at which an activation threshold Q is reached. The sensor (13) is thus provided with a variation detector which, for each pixel, measures and records the luminous intensity of the pixel when this intensity has varied beyond a threshold Q. FIG. 3a shows an example of a light intensity profile P1 seen by a pixel of the matrix of the asynchronous vision sensor. 
Whenever this intensity increases by an amount equal to the activation threshold Q in comparison with what it was at time tk, a new event is identified and a positive spike (level +1 in FIG. 3b) is emitted corresponding to the instant at which the differential threshold Q is exceeded, denoted tk+1. Symmetrically, each time the intensity of the pixel decreases by the quantity Q compared to what it was at the instant tk', a new event is identified and a negative spike (level -1 in FIG. 3b) is emitted corresponding to the instant at which the differential threshold Q is exceeded, denoted tk'+1. The asynchronous signal sequence for the pixel then consists of a succession of pulses or spikes, positive or negative, positioned in time at instants tk depending on the light profile for the pixel. These spikes can be represented mathematically by positive or negative Dirac peaks, each characterized by an emission instant tk and a sign bit. The information corresponding to an event for a pixel may comprise a first piece of information relating to an instant of occurrence of the event, and a second piece of information relating to a luminous characteristic for the pixel at this instant. The shape of the asynchronous information for a pixel may be different from a succession of Dirac peaks, the events represented being able to have a temporal width or an amplitude or any waveform. The output signal of the asynchronous camera (10) corresponds to what is called an address-event representation (AER): to each pixel corresponds an event-based signal sequence. FIG. 3c shows the intensity profile P2 which can be reconstructed as an approximation of the profile P1 by time integration of the asynchronous signal of FIG. 3b. The activation threshold Q may be fixed, as in the case of FIGS. 3a-3c, or adaptive as a function of the light intensity, as in the case of FIGS. 4a and 4b. 
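The fixed-threshold acquisition and the time-integration reconstruction just described can be sketched as follows. This is an illustrative simulation on sampled values, not the analog per-pixel circuit of the sensor; the function names and the example profile are assumptions.

```python
def dvs_events(samples, q):
    """Emit a (time, +1/-1) event whenever the intensity drifts by the
    threshold q from its level at the last event (fixed-threshold variant).

    `samples` is a list of (time, intensity) pairs for one pixel; a real
    sensor does this continuously and independently for every pixel.
    """
    events = []
    _, ref = samples[0]
    for t, i in samples[1:]:
        while i - ref >= q:   # intensity rose by q: positive spike
            ref += q
            events.append((t, +1))
        while ref - i >= q:   # intensity fell by q: negative spike
            ref -= q
            events.append((t, -1))
    return events

def reconstruct(events, start, q):
    """Approximate the intensity profile by time integration of the
    event train, as in the P2 reconstruction of Figure 3c."""
    level, profile = start, []
    for t, pol in events:
        level += pol * q
        profile.append((t, level))
    return profile

# Example intensity profile for one pixel, threshold Q = 0.25.
profile = [(0, 1.0), (1, 1.25), (2, 1.6), (3, 1.1), (4, 1.05)]
evts = dvs_events(profile, q=0.25)
print(evts)  # -> [(1, 1), (2, 1), (3, -1)]
print(reconstruct(evts, start=1.0, q=0.25))
```

Note that the slow drift between t = 3 and t = 4 never crosses the threshold and therefore produces no data at all, which is the source of the compactness discussed above.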
For example, the threshold ±Q can be compared to changes in the logarithm of the light intensity for the generation of an event ±1. [0014] The class of asynchronous photosensitive sensors generating events from light intensity variations is referred to by the acronym DVS, for "Dynamic Vision Sensor". For example, the asynchronous vision sensor (13) may be a DVS sensor of the type described in "A 128x128 120 dB 15 µs Latency Asynchronous Temporal Contrast Vision Sensor", P. Lichtsteiner et al., IEEE Journal of Solid-State Circuits, Vol. 43, No. 2, February 2008, pp. 566-576, or in US patent application 2008/0135731 A1. Another generation of asynchronous photosensitive sensors makes it possible to generate asynchronous information indicating events as well as an associated characteristic, for example a gray level. [0015] The article by Posch, C., Matolin, D., and Wohlgenannt, R. (2011), entitled "A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS", published in the IEEE Journal of Solid-State Circuits, 46, pages 259-275, doi: 10.1109/JSSC.2010.2085952, provides a description of examples of events encoded by gray levels. The asynchronous information for each pixel is again a succession of pulses or spikes positioned in time at instants tk depending on the light profile for the pixel. Each event may, for example, correspond to two successive pulses, the first indicating the instant of the event and the second making it possible to determine a gray level for the pixel as a function of the time difference between the two pulses. The information corresponding to an event for a pixel thus comprises a first piece of information relating to an instant of occurrence of the event, and a second piece of information relating to a luminous characteristic (gray level) for the pixel at this instant. 
For example, the detection of an event for a given pixel may result in the generation by the sensor (13) of a first signed pulse identifying the event (for example a positive event for an increasing illumination of the pixel, and a negative event for a decreasing illumination of the pixel), and a second pulse for characterizing the amount of light detected: if the light variation is slow, the gap between the two pulses will be long, which will reflect a small amount of light, and therefore a dark gray level. There will have been enough light variation to trigger the detection of an event, but little light during this variation. Conversely, if the light variation is fast, the gap between the two pulses will be short, which will reflect a large amount of light, and thus a light gray level. The pulse train generated by the sensor (13) may be transmitted to the processing unit (14), which will generate on this basis event information for the pixel, such as time-difference information with respect to the previous event for the pixel, the gray level associated with the event for the pixel, the position (for example in the form of coordinates (x, y) in the matrix of pixels) of the pixel for which the event has been detected, and the direction of the light variation detected for the pixel (increasing or decreasing). This event information for each pixel is multiplexed to generate an output signal (15) of the asynchronous camera (10). [0016] By way of example, the asynchronous camera (10) can incorporate, in one embodiment, a new-generation event-based asynchronous vision sensor (13), which is sometimes referred to by the acronym ATIS, for "Asynchronous Time-based Image Sensor". The asynchronous camera and the ATIS sensor it incorporates may, for example, be of the type described in the article by C. Posch et al., entitled "An Asynchronous Time-based Image Sensor" (IEEE International Symposium on Circuits and Systems, 2008, pages 2130-2133), or of the type described in the article by C. 
Posch et al., entitled "A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS" (46(1): 259-275, 2011). Another example of an ATIS vision sensor that can be used in the proposed imaging system is described in the article by T. Delbruck, P. Lichtsteiner, and C. Posch, entitled "A 128x128 120 dB 15 µs latency asynchronous temporal contrast vision sensor" (IEEE Journal of Solid State Circuits, Vol. 43, No. 2, February 2008, pp. 566-576). This CMOS sensor, which has a matrix of 128 x 128 pixels, makes it possible to reach a dynamic range higher than 120 dB for a temporal resolution of 15 µs. For an asynchronous camera, the temporal resolution can therefore be defined as the minimum time difference between two pulses that the camera can generate as a function of the variations in luminous intensity for a pixel. The examples mentioned above indicate that for an ATIS camera the temporal resolution reached may be of the order of a microsecond, a significant gain compared to conventional cameras (operating with a frame clock or a pixel clock). [0017] This significant temporal resolution gain is used for the implementation of the proposed imaging systems, which can offer unparalleled performance for the observation and analysis of dynamic phenomena. This is the case, for example, in the field of observation and analysis of blood microcirculation, where the use of asynchronous cameras in an imaging system can allow the measurement of physiological parameters of blood cells (and in particular red blood cells - in English, RBC for Red Blood Cells), such as their speed, their trajectory, and the amount of light returned at different wavelengths. Similarly, the dynamic performance that can be achieved with a DVS or ATIS type sensor is much greater than that achievable with a conventional video camera having a realistic sampling rate. 
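The two-pulse gray-level encoding described earlier (a short gap between the event pulse and the exposure pulse meaning much light, a long gap meaning little light) can be illustrated by a small decoding sketch. The linear mapping and the calibration bounds t_min and t_max are illustrative assumptions, not taken from the patent or the cited sensors.

```python
def decode_gray(t_event, t_exposure, t_min, t_max):
    """Map the gap between an ATIS-style event pulse and its exposure
    pulse to an 8-bit gray level: a short gap means a large amount of
    light (bright pixel), a long gap means little light (dark pixel).

    The linear mapping and the [t_min, t_max] calibration bounds are
    assumptions for illustration only.
    """
    gap = min(max(t_exposure - t_event, t_min), t_max)
    # Invert: gap == t_min -> 255 (bright), gap == t_max -> 0 (dark).
    return round(255 * (t_max - gap) / (t_max - t_min))

print(decode_gray(0.0, 1.0, 1.0, 100.0))    # fast exposure -> 255
print(decode_gray(0.0, 100.0, 1.0, 100.0))  # slow exposure -> 0
```

A real sensor performs the exposure measurement in the pixel circuit itself; this sketch only shows how the downstream processing unit could turn the inter-pulse delay into a gray value.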
For example, a sensor of this type achieves microsecond time resolutions with a luminance range greater than 120 dB, which is much higher than that of a standard CMOS/CCD camera, which typically has a luminance range of 60-70 dB. Figure 5 shows a diagram of an imaging system using the dark-field microscopy principle according to one embodiment. The imaging system (100) illustrated comprises an optical medium (104) that can be arranged as a hollow cylinder. On one end of the cylinder of the optical medium (104) are mounted one or more light-emitting diodes (102a-102d), arranged on the section of the cylinder so as to form an illumination ring for the body (106) to be analyzed, by projecting a plurality of optical signals of respective substantially constant light intensities. [0018] These LEDs (102a-102d), forming a set of light sources, could be replaced or supplemented by laser diodes in another embodiment. LED sources have the advantage of allowing the generation of a broad-spectrum optical signal, a useful property, for example, for performing a blood cell velocity analysis in a microcirculation analysis application. The signal emitted by a laser source will be spectrally narrower. These light sources are thus arranged on the optical medium (104) to each project an excitation optical signal (105a-105d) towards the body (106) to be analyzed during use of the system (101). [0019] In one embodiment, optical treatments may be performed on the signals emitted by the light sources (for example focusing, amplification, etc.), so that the excitation signal projected onto the body to be observed is indirectly derived from the light sources. The optical medium (104) is further arranged to receive, within the hollow cylinder, the light (107) backscattered by the body (106) to be analyzed, this inner tube forming a dark area in which the backscattered light (107) is routed to the asynchronous camera (108). 
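As an illustration of the kind of real-time analysis the event stream enables, such as the red-blood-cell velocity measurement mentioned above, the following is a hypothetical post-processing sketch. It is not a method claimed by the patent; it assumes a single dominant object moving horizontally across the field of view and fits its speed from event coordinates.

```python
def estimate_velocity(events):
    """Least-squares fit of event x-coordinates against time: a crude
    estimate of the horizontal speed (in pixels per unit time) of a
    single moving object, e.g. a blood cell crossing the field of view.

    Illustrative assumption: `events` is a list of (t, x, y) tuples all
    produced by the same object, and motion is along x only.
    """
    n = len(events)
    ts = [t for t, x, y in events]
    xs = [x for t, x, y in events]
    mt, mx = sum(ts) / n, sum(xs) / n
    num = sum((t - mt) * (x - mx) for t, x in zip(ts, xs))
    den = sum((t - mt) ** 2 for t in ts)
    return num / den  # slope of x over t

# Events from a cell moving at about 2 pixels per millisecond
# (times in ms, positions in pixels).
evts = [(0.0, 10, 5), (0.5, 11, 5), (1.0, 12, 5), (1.5, 13, 5)]
print(estimate_velocity(evts))  # -> 2.0
```

Because each event carries its own microsecond-scale timestamp, such a fit can resolve displacements far faster than a 20 ms frame period would allow.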
The asynchronous camera (108) is positioned at the end of the hollow cylinder of the optical medium (104) opposite that on which the light sources (102a-102d) are arranged, so that the backscattered light passes through its optics (108a) and illuminates its vision sensor when the system (101) is in use. The camera (108) is further provided with an output interface (109) for an imaging signal, on which different treatments can be performed for the intended application. This configuration of the optical medium (104) makes it possible to create a light ring which illuminates the tissues to be analyzed in the case of a dark-field microcirculation analysis application, as well as a dark zone in which the backscattered light is routed, allowing better contrast when the system is positioned near the body to be analyzed, by minimizing the glare associated with backscattering from the upper layers of the body studied. [0020] Indeed, the device makes it possible to project a large amount of light through the light ring while analyzing only the parts of the body studied which are not directly illuminated, by means of the analysis of the backscattered light coming from the central area of the illumination ring. In one embodiment, the light sources (102a-102d) are arranged to emit respectively red, green, and/or infrared light, depending on the post-processing performed. For example, it is possible to configure the light sources so that a first source emits red light, a second source emits green light, and a third source emits infrared light. The light sources may therefore be configured to emit, together during the use of the device, light signals at different respective wavelengths. Indeed, it is observed that the amount of light absorbed by the tissues in the case of the analysis of the microcirculation depends on the wavelength emitted. For example, red light and infrared light can be used to analyze blood oxygenation. [0021] Although FIG. 
5 illustrates an imaging device operating on the principle of dark field microscopy, the proposed device is not limited to a specific imaging technology, and could implement, in other embodiments, all types of imaging technologies, such as orthogonal polarization spectral (OPS) imaging technology. [0022] In an embodiment illustrated in FIG. 6, the proposed imaging device comprises at least six light sources, including two red light sources (200a, 200b), two green light sources (201a, 201b), and two infrared light sources (202a, 202b), for example mounted on the end (203) of a hollow cylinder of an optical medium (204) of the device. [0023] Fig. 7a shows a variant of the device of Fig. 5, wherein one or more light sources (302) are optically coupled to the optical medium (304) of the device via optical fibers (302a). The optical medium is shaped as a hollow cylinder, on one end of which an asynchronous camera (308) is mounted by mechanical coupling. The signal generated by the asynchronous camera (308) is produced on an output interface (309) of the camera (308), for example a USB connection. FIG. 7b is a partial schematic view of the device (301) of FIG. 7a, and shows an ATIS vision sensor (308b) placed at the focal plane of the acquisition optics (308a) of the ATIS camera (308), for example so as to image a blood cell of an observed body (306) onto a block of 3 × 3 pixels. FIG. 7b further shows the internal structure of the optical medium (304), in which are arranged one or more light guides coupled to optical fiber input connections or, depending on the embodiment, one or more ducts shaped to receive an optical fiber end (303), in order to bring in incident light from the light source(s) (302). The optical medium (304) is thus optically coupled to the light source (302) and arranged to project a substantially constant light intensity excitation signal (305) (incident light Ii) from the light source (302
) towards a body (306) to be observed when the device (300) is in use. The backscattered light I and Io (307) is conveyed through the inner portion of the hollow cylinder of the optical medium (304) to the ATIS sensor (308b). Although FIGS. 7a and 7b illustrate an imaging device operating on the principle of dark field microscopy, the proposed device is not limited to a specific imaging technology, and could implement, in other embodiments, all types of imaging technologies, such as orthogonal polarization spectral (OPS) imaging technology. Fig. 8 shows an embodiment in which the imaging device comprises a plurality of ATIS cameras. The ATIS cameras of the device are spatially arranged so as to capture, for each pixel, substantially the same light signal backscattered by the body to be observed. FIG. 8 is a diagram illustrating the imaging device (401) operating according to the principle of dark field microscopy. The imaging system (401) illustrated comprises an optical medium (404) shaped as a hollow cylinder. On one of the ends of the optical medium cylinder (404) are mounted one or more light sources (402a-402d), for example light-emitting diodes and/or laser diodes, arranged on the cross-section of the cylinder so as to form a ring illuminating the body (406) to be analyzed, by projecting a plurality of light excitation signals of substantially constant intensity. These light sources (402a-402d) are thus arranged on the optical medium (404) so as to each project an excitation optical signal (405a-405d) towards the body (406) to be analyzed when the system (401) is in use. In one embodiment, optical processing (for example focusing, amplification, etc.) may be applied to the signals emitted by the light sources, so that the excitation signals projected onto the body to be observed are respectively derived indirectly from the light sources (402a-402d).
[0024] The optical medium (404) is further arranged to receive, within the hollow cylinder, the light (407) backscattered by the body (406) to be analyzed, this inner tube forming a dark area in which the backscattered light (407) is routed to the asynchronous sensor block (408) when the device is in use. In the illustrated embodiment, the asynchronous sensor block (408) is positioned at the end of the hollow cylinder of the optical medium (404) opposite the end where the light sources (402a-402d) are arranged, so that the backscattered light (407) passes through an acquisition optics (408a) and illuminates the vision sensors of the asynchronous sensor block (408) when the system (401) is in use. The asynchronous sensors of the block (408) can therefore share the same acquisition optics (408a) in one embodiment. This embodiment is however not limiting, and each asynchronous camera of the device can have its own acquisition optics while being arranged to capture asynchronous events corresponding to variations of the light (407) backscattered by the body to be observed (406) for each pixel of its vision sensor. The asynchronous cameras are each further provided with an output interface (not shown in the figure) for an imaging signal, on which various processing operations can be performed depending on the intended application. Although FIG. 8 illustrates a multi-camera imaging scheme operating on the principle of dark field microscopy, the proposed device is not limited to a specific imaging technology, and could implement, in other embodiments, all types of imaging technologies, such as OPS technology. Figures 9a and 9b illustrate exemplary embodiments of blocks of two and three asynchronous sensors, respectively. With reference to FIG. 9a, the light (508a) backscattered by the body under observation passes through the acquisition optics (501) of an asynchronous camera of the proposed imaging device and produces a light beam (508b), which
partly passes through and is partly reflected by a dichroic mirror (506) arranged to split the light beam (508b) coming from the acquisition optics (501) into two beams. The portion of the light beam (508b) not reflected by the dichroic mirror (506) is processed by an optical processing block (504), for example to perform an optical filtering of the light beam (508b), which outputs a light beam (508c) that illuminates a first asynchronous vision sensor (502). The portion (508d) of the light beam (508b) reflected by the dichroic mirror (506) is reflected on a mirror (507) arranged to redirect the light beam (508d) coming from the acquisition optics (501) and the dichroic mirror (506) towards a second assembly of an optical processing block (505) and an asynchronous vision sensor (503). The reflected beam (508d) is thus processed by an optical processing block (505), for example to perform an optical filtering of the light beam (508d), which outputs a light beam (508e) that illuminates a second asynchronous vision sensor (503). These assemblies of optical processing block and asynchronous vision sensor make it possible to select certain characteristics of the signal that illuminates each asynchronous vision sensor, while ensuring that each signal comes from the light backscattered by the body to be observed, so that each sensor observes the same scene. Thus, each event detected by the asynchronous cameras corresponds to the same body. For example, in the context of a microcirculation observation and analysis application, each event relative to the same blood cell at a given instant will be detected by each of the asynchronous sensors of the imaging device.
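Since the same physical change (for example a passing blood cell) is seen by every sensor, a natural post-processing step is to pair events from two sensors that occur at the same pixel within a short time window. The sketch below is an illustrative assumption about how such coincidence matching could be done, not a method specified in the patent; the 50 µs default window and the function name are arbitrary choices.

```python
def coincident_events(stream_a, stream_b, window=50e-6):
    """Pair events (x, y, t) from two asynchronous sensors assumed to
    observe the same scene: events at the same pixel coordinates within
    `window` seconds are taken to stem from the same physical change."""
    by_pixel = {}
    for (x, y, t) in stream_b:
        by_pixel.setdefault((x, y), []).append(t)
    pairs = []
    for (x, y, t) in stream_a:
        # First event of stream B at this pixel that is close enough in time.
        match = next((tb for tb in by_pixel.get((x, y), [])
                      if abs(t - tb) <= window), None)
        if match is not None:
            pairs.append(((x, y), t, match))
    return pairs

red = [(3, 3, 0.001000)]        # event from the red-filtered sensor
infrared = [(3, 3, 0.001020)]   # event from the infrared-filtered sensor
print(coincident_events(red, infrared))   # one coincident pair at pixel (3, 3)
```

This kind of pairing assumes the pixel matrices of the sensors have already been brought into correspondence, as the following paragraph requires.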
The imaging device proposed in the asynchronous multi-camera embodiment is thus arranged to capture information on the same pixel (assuming a correspondence between the pixels of the matrices of the various asynchronous vision sensors) at the same instant. This makes it possible not to reduce the spatial resolution of the observation (as a Bayer filter does, for example) while preserving the high temporal resolution obtained with asynchronous cameras. For example, the optical processing blocks of Fig. 9a may each include an optical filter, one passing only red light and the other only infrared light, in order to acquire imaging data for determining a blood cell oxygenation rate in a microcirculation analysis application. More generally, the multi-camera embodiment of the proposed imaging device can be used for spectral analyses of the light backscattered by the body to be analyzed, using for each camera optical filters operating on a frequency band to be analyzed. With reference to FIG. 9b, the asynchronous vision sensor block may comprise three assemblies of optical processing block and asynchronous sensor. The backscattered light (608a) passes through an acquisition optics (601) that generates a beam (608b) partially reflected by a first dichroic mirror (606). The portion of the beam (608b) not reflected by the first dichroic mirror (606) passes through a first optical processing block (604) that outputs a beam (608c) that illuminates a first asynchronous vision sensor (602). The portion (608d) of the beam (608b) reflected by the first dichroic mirror (606) is itself partially reflected on a second dichroic mirror (607) so as to be redirected towards a second optical processing block (605), which outputs a beam (608e) that illuminates a second asynchronous vision sensor (603).
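As a purely hypothetical illustration of the red/infrared post-processing mentioned above (the patent does not specify the computation), one could compare the per-pixel event activity of the two spectral channels. Mapping such a ratio to an actual oxygenation value would require a calibration that is not shown here; all names and data below are illustrative.

```python
from collections import Counter

def channel_activity(events):
    """Count events per pixel in one spectral channel;
    events are (x, y, t, polarity) tuples."""
    return Counter((x, y) for (x, y, t, p) in events)

def red_ir_ratio(red_events, ir_events):
    """Per-pixel ratio of red to infrared event activity -- an
    illustrative input to an oxygenation estimate (the mapping from
    this ratio to an oxygenation rate needs calibration, not shown)."""
    red, ir = channel_activity(red_events), channel_activity(ir_events)
    return {px: red[px] / ir[px] for px in red if ir.get(px)}

red = [(0, 0, t, 1) for t in range(6)]   # 6 red events at pixel (0, 0)
ir = [(0, 0, t, 1) for t in range(3)]    # 3 infrared events at pixel (0, 0)
print(red_ir_ratio(red, ir))   # → {(0, 0): 2.0}
```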
The portion (608f) of the beam (608d) not reflected by the second dichroic mirror (607) is reflected on a third mirror (611) so as to be redirected towards a third optical processing block (610), which outputs a beam (608g) that illuminates a third asynchronous vision sensor (609). This embodiment of the imaging device with three asynchronous cameras makes it possible to carry out post-processing requiring parallel imaging capture of the body to be analyzed. [0025] With reference to Figures 9a and 9b, the illustrated dichroic mirrors are provided as an example of an optical component for dividing an incident beam into two beams. Moreover, the arrangement of such optical components may vary depending on the spatial configuration of the asynchronous vision sensors or asynchronous cameras. [0026] In another embodiment, the proposed imaging device may comprise four cameras. Three light sources (red, green and infrared) illuminate the sample indirectly via a structure that is, for example, annular. The light backscattered by the sample to be analyzed is then picked up by an optical system located above a dark zone, for a device operating on the principle of dark field microscopy. The light is then distributed according to its wavelength over the four cameras, for example by means of filters arranged on the path of the light before it reaches the sensor of each camera: an asynchronous camera for red, an asynchronous camera for infrared, an asynchronous camera for green, and a conventional high spatial resolution camera (for example 4 Mpixels at a temporal resolution of 300 frames per second), also for green. Figure 10 illustrates the proposed imaging method (701) according to one embodiment. In this embodiment, an optical signal is generated (702), from which a light excitation signal of substantially constant light intensity is derived.
The light excitation signal may be derived from the optical signal directly, or indirectly after certain processing operations (focusing, frequency processing, polarization processing, etc.). The light excitation signal is projected (703) onto the body to be observed, in order to capture the light backscattered by this body. The use of at least one asynchronous sensor makes it possible to generate (704) a signal comprising, for each pixel of the pixel array of the sensor, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel. Although described through a number of detailed exemplary embodiments, the proposed method and the equipment for carrying it out include various alternatives, modifications and improvements that will become apparent to those skilled in the art, it being understood that these various alternatives, modifications and improvements are within the scope of the invention, as defined by the following claims. In addition, various aspects and features described above may be implemented together, or separately, or substituted for each other, and all of the various combinations and sub-combinations of aspects and features are within the scope of the invention. In addition, some of the systems and equipment described above may not incorporate all of the modules and features described for the preferred embodiments. The information and signals described herein may be represented using a variety of technologies and techniques. For example, instructions, messages, data, commands, information, signals, bits and symbols may be represented by voltages, currents, electromagnetic waves or a combination thereof.
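The three steps of the method (702, 703, 704) can be summarized in a schematic sketch. The reflectance model, the threshold, and the function names below are illustrative assumptions, not the patent's implementation; the point is only the structure of the pipeline (constant excitation in, one event sequence per pixel out).

```python
def make_events(intensity_trace, threshold=0.2):
    """Change detector for one pixel (same idea as an ATIS pixel):
    emit (t, polarity) when the intensity deviates from the reference."""
    events, ref = [], intensity_trace[0]
    for t, val in enumerate(intensity_trace[1:], start=1):
        if abs(val - ref) >= threshold:
            events.append((t, 1 if val > ref else -1))
            ref = val
    return events

def imaging_method(body_reflectance, excitation=1.0):
    """Steps 702-704, schematically: a constant-intensity excitation is
    projected on the body (703); the backscattered intensity per pixel is
    modeled as excitation times reflectance; each pixel then yields its
    own asynchronous event sequence (704)."""
    backscatter = {px: [excitation * r for r in trace]
                   for px, trace in body_reflectance.items()}
    return {px: make_events(trace) for px, trace in backscatter.items()}

# A 2-pixel "body": a cell darkens pixel (0, 0) at t=2 and leaves at t=4.
body = {(0, 0): [1.0, 1.0, 0.6, 0.6, 1.0], (0, 1): [1.0, 1.0, 1.0, 1.0, 1.0]}
print(imaging_method(body))   # → {(0, 0): [(2, -1), (4, 1)], (0, 1): []}
```

Note that the unchanged pixel (0, 1) produces an empty event sequence, reflecting the event-driven character of the acquisition.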
Depending on the embodiment chosen, certain acts, actions, events or functions of each of the methods described herein may be performed or occur in an order different from that in which they have been described, or may be added, merged, or not be performed or not occur, as the case may be. In addition, in some embodiments, certain acts, actions or events are performed or occur concurrently rather than successively.
Claims (10) [0001] 1. An imaging device comprising: at least one light source arranged to generate an optical signal; an optical medium coupled to the light source and arranged to project a substantially constant light intensity excitation signal from the light source towards a body to be observed when the device is in use; and a first asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a first pixel array, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel. [0002] 2. The device according to claim 1, comprising a second asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a second pixel array, a signal sequence representing asynchronous events corresponding to variations in the light backscattered by the body to be observed for the pixel, and wherein a first optical filter is arranged to filter the backscattered light picked up by the first asynchronous camera. [0003] 3. The device according to claim 2, wherein a second optical filter is arranged to filter the backscattered light picked up by the second asynchronous camera. [0004] 4. The device according to claim 3, comprising a third asynchronous camera coupled to the optical medium and configured to generate a signal comprising, for each pixel of a third pixel array, a signal sequence representing asynchronous events corresponding to variations in the light backscattered by the body to be observed for the pixel, and wherein a third optical filter is arranged to filter the backscattered light picked up by the third asynchronous camera. [0005] 5. The device according to any one of the preceding claims, wherein the light source is a light-emitting diode or a laser source. [0006] 6.
The device according to any one of the preceding claims, comprising a plurality of light sources arranged on the optical medium for generating light excitation signals of respective substantially constant light intensities. [0007] 7. The device according to claim 6, wherein the light sources are further arranged on the optical medium so as to form a light ring. [0008] 8. The device according to any one of the preceding claims, wherein the asynchronous camera is arranged on the medium, above an area not illuminated by the light source. [0009] 9. A microcirculation imaging system comprising a device according to any one of claims 1 to 8. [0010] 10. An imaging method, comprising: generating an optical signal; projecting a light excitation signal of substantially constant light intensity derived from the optical signal towards a body to be observed; and generating a signal comprising, for each pixel of a first pixel array, a signal sequence representing asynchronous events corresponding to variations of the light backscattered by the body to be observed for the pixel.
Patent family:
Publication number | Publication date FR3031035B1|2019-07-05| EP3236841B1|2019-06-26| ES2744912T3|2020-02-26| US10955654B2|2021-03-23| EP3236841A1|2017-11-01| US20180024343A1|2018-01-25| WO2016102819A1|2016-06-30|
Legal status:
2015-11-27 | PLFP | Fee payment | Year of fee payment: 2
2016-07-01 | PLSC | Search report ready | Effective date: 20160701
2016-11-25 | PLFP | Fee payment | Year of fee payment: 3
2017-12-22 | TQ | Partial transmission of property | Owner name: INSERM (INSTITUT NATIONAL DE LA SANTE ET DE LA, FR; Owner name: UNIVERSITE PIERRE ET MARIE CURIE (PARIS 6), FR; Owner name: ASSISTANCE PUBLIQUE-HOPITAUX DE PARIS (APHP), FR; Owner name: UNIVERSITE PARIS-SUD, FR; Owner name: CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE, FR; Effective date: 20171120
2017-12-28 | PLFP | Fee payment | Year of fee payment: 4
2018-12-19 | PLFP | Fee payment | Year of fee payment: 5
2020-10-16 | ST | Notification of lapse | Effective date: 20200906