METHOD OF CORRECTING A FLUORESCENCE IMAGE
Patent abstract:
The invention relates to a method for correcting a fluorescence image of an object, in particular a biological tissue that may contain fluorescent agents. According to this method, the distance between a fluorescence probe generating the fluorescence image and the examined object is measured. This distance is used to apply a correction function to the fluorescence image. One field of application of the invention is intraoperative fluorescence imaging for the diagnosis and monitoring of the evolution of pathologies, in particular cancers.
Publication number: FR3036187A1
Application number: FR1554260
Filing date: 2015-05-12
Publication date: 2016-11-18
Inventors: Philippe Rizo; Anthony Daures
Applicants: FLUOPTICS; Commissariat à l'Energie Atomique (CEA); Commissariat à l'Energie Atomique et aux Energies Alternatives (CEA)
Main IPC class:
Patent description:
[0001] FIELD OF THE INVENTION The field of the invention is imaging, and in particular intraoperative fluorescence imaging for diagnosing and monitoring the evolution of pathologies or treatments in the medical field. PRIOR ART Fluorescence imaging is a technique for locating fluorescent markers in a human or animal body. One of the main applications is the localization of fluorescent markers, or fluorophores, the latter targeting cells of interest, for example cancer cells. The protocol consists in injecting these markers into the body before surgery, so that during the intervention the practitioner is able to visualize the cancer cells using a fluorescence image. Because it makes it possible to acquire an image indicating the localization of the different cancerous areas, intraoperative fluorescence imaging provides information hitherto inaccessible to the practitioner, and constitutes a useful complement, or even an alternative, to the use of radioactive tracers. Another application is interventional assistance in plastic and reconstructive surgery, cardiovascular surgery, lymphatic surgery, or liver surgery, where fluorescence imaging allows visual monitoring of lymphatic drainage, perfusion or vascularization. The principle of fluorescence imaging is to illuminate a field of view using a light source in a spectral band of fluorophore excitation. Under the effect of this illumination, the fluorophores emit fluorescence radiation, in a spectral band of fluorescence. This radiation can be captured by a fluorescence probe, so as to form a fluorescence image on which the different fluorescent zones appear. It is then possible to acquire a visible image of the observed field, and to superimpose the fluorescence image on this visible image. [0002] The fluorescence probe may be a compact assembly, including the excitation light source, and be carried by a practitioner, being placed within a few tens of centimeters of a tissue being examined.
However, depending on the position of the probe relative to the tissue, the intensity of the fluorescence images may vary, particularly as a function of the distance between the probe and the observed tissue. Thus, although the images make it possible to correctly locate the fluorescent zones, they are difficult to compare with one another from a quantitative point of view, unless the practitioner is forced to place the fluorescence probe at a fixed distance from the observed tissue. Some devices are based on this constraint and only allow a quantitative approach at the cost of maintaining the probe at a set distance from the observed tissue. [0003] This is a significant disadvantage. Document US8606350 aims to solve this problem by using a fluorescence probe capable of measuring the distance separating it from the examined biological tissue. It is then possible to correct the fluorescence image provided by this probe using this distance. The objective is to obtain quantitative fluorescence imaging, in particular to quantify and compare the intensity levels of the observed fluorescent zones. To that end, a fluorescence image is acquired, and a correction of this image is carried out by multiplying it by the previously measured distance. But this approach is based on the assumption of an inversely proportional relationship between the intensity of the measured signal and the distance, and does not completely overcome the influence of the distance on the fluorescence image obtained. The inventors propose to improve the method described in document US8606350, by proposing an improved correction method.
SUMMARY OF THE INVENTION An object of the invention is a method of correcting a fluorescence image, comprising the following steps: - illumination of an object with the aid of an excitation light source; - detection of fluorescence radiation by a fluorescence image sensor, the fluorescence radiation being emitted by the object under the effect of said illumination; - acquisition of said fluorescence image by said fluorescence image sensor from the detected fluorescence radiation; the method being characterized in that it comprises the application of a correction function to the fluorescence image, said correction function using an illumination image representative of the spatial distribution of an illumination produced on the object by the excitation light source. [0004] According to one embodiment, the method also comprises a step of measuring a distance between the excitation source and the object, said correction function being dependent on this distance, by using an illumination image associated with said measured distance. Preferably, the correction function is such that two illumination images, respectively associated with two different measured distances, are different from each other. The method may comprise the following characteristics, taken separately or in technically feasible combinations: - Said correction function may perform a ratio, term by term, of the fluorescence image by said illumination image. - The illumination image may be normalized. - The fluorescence image is acquired according to an exposure time, and the correction function is also able to normalize the fluorescence image with respect to a reference exposure time. - The correction function takes into account the square of said measured distance as well as the square of a reference distance. - The method comprises a step of selecting said reference distance as a function of the fluorescence image.
- The correction function comprises a product of said fluorescence image by a ratio between the square of said measured distance and the square of said reference distance. [0005] - The distance between the excitation light source and the object is measured by means of a rangefinder capable of emitting a wave towards the object and detecting a wave reflected by said object, from which said distance is estimated. The rangefinder can then be a laser rangefinder, an ultrasonic rangefinder or a microwave rangefinder. [0006] - The distance between the excitation light source and the object is measured by an optical telemeter comprising a matrix photodetector capable of determining a plurality of distances between the excitation light source and, respectively, a plurality of surface elements forming a surface of the object. In this case, the correction function may comprise a term-by-term product of said fluorescence image by a ratio between the square of the distance measured between the excitation source and a surface element and the square of a reference distance. - The distance between the excitation light source and the object is measured by an autofocus system, able to automatically set a focusing distance of the fluorescence image sensor relative to the object. - The method comprises a step of acquiring a visible image of the object using a visible image sensor.
Another object of the invention is a fluorescence image acquisition device comprising: - an excitation light source, capable of illuminating an object in an excitation spectral band; - a fluorescence image sensor capable of collecting fluorescence radiation emitted by said object, in a fluorescence spectral band, under the effect of the illumination by the excitation light source, the fluorescence image sensor being able to acquire a fluorescence image from the collected fluorescence radiation; - a rangefinder for measuring a distance between the excitation light source and the object, the rangefinder being able to emit an optical wave towards the object and to detect an optical wave reflected by said object, from which said distance is measured; the acquisition device being characterized in that it comprises one or more of the following characteristics, taken separately or in combination: - the fluorescence image sensor is centered around an optical axis and the rangefinder is able to emit said optical wave along said optical axis; - the fluorescence image sensor comprises an optical focusing system and the rangefinder is configured to emit an optical wave, in the direction of the object, centered in this optical system; - the device comprises a selector of a reference distance, for storing said reference distance in a memory; - the rangefinder comprises a matrix photodetector capable of determining a plurality of distances respectively between the excitation source and a plurality of surface elements forming the surface of the object; - the device comprises a processor capable of implementing the method of correcting said fluorescence image as previously described.
Another object of the invention is a method of correcting a fluorescence image, comprising the following steps: - illumination of an object with the aid of an excitation light source; - detection of fluorescence radiation by an image sensor, the fluorescence radiation being emitted by the object under the effect of said illumination; - acquisition of a fluorescence image by said image sensor from the detected fluorescence radiation; - measurement of a distance between the excitation light source and the object; - application of a correction function to the fluorescence image, said correction function performing a correction of the fluorescence image as a function of the square of said measured distance. FIGURES FIG. 1 shows a device according to a first embodiment of the invention. FIG. 2 shows a variant of this device. FIG. 3A shows an image of the illumination when the illumination source is placed at a distance of 10 cm from the screen. FIG. 3B represents a profile of the intensity of the pixels of image 3A along a horizontal line passing through the center of this image. [0007] FIG. 4A shows an image of the illumination when the illumination source is placed at a distance of 20 cm from the screen. FIG. 4B represents a profile of the intensity of the pixels of image 4A along a horizontal line passing through the center of this image. FIG. 5A shows an image of the illumination when the illumination source is placed at a distance of 30 cm from the screen. FIG. 5B shows a profile of the intensity of the pixels of image 5A along a horizontal line passing through the center of this image. FIG. 6A shows an uncorrected fluorescence image. FIG. 6B represents a fluorescence image corrected according to a first embodiment of a correction method according to the invention. FIG. 6C shows profiles of pixel intensity of images 6A and 6B along a horizontal line through the center of these images. FIGS.
7A, 7B, 7C and 7D are logic diagrams showing the steps of correction methods according to a first, second, third and fourth embodiment, respectively. FIGS. 8A, 8B, 8C and 8D are images respectively obtained without correction, and by implementing a correction function according to a first, third and fourth embodiment. FIG. 9 represents a second embodiment of a device according to the invention. FIG. 10 represents a third embodiment of a device according to the invention. FIG. 11 represents a fourth embodiment of a device according to the invention. [0008] FIG. 12 shows a fifth embodiment of a device according to the invention. FIG. 13 represents a sixth embodiment of a device according to the invention. DESCRIPTION OF PARTICULAR EMBODIMENTS FIG. 1 shows a first embodiment. A fluorescence probe 1 comprises an excitation light source 11, able to emit an excitation light beam 12, in an excitation spectral band λex, so as to illuminate an object 10. The object 10 is for example a biological tissue, exposed to the probe 1 during a surgical procedure. The light source is, for example, a light-emitting diode, optionally comprising an excitation filter, capable of blocking the wavelengths outside the excitation spectral band λex. It may also be a laser diode, or one end of an excitation optical fiber, the other end of this fiber being disposed opposite a light source. The excitation source 11 can be divided into several elementary excitation sources 11.1, 11.2 ... 11.n. [0009] The probe also comprises an emission image sensor 16, comprising a focusing optical system 18 and a matrix photodetector 19. The emission image sensor 16 is able to form an image Iem of an emission radiation 14 produced by the object 10 under the effect of the illumination by the excitation beam 12. This emission radiation 14 can be emitted by the object in the same spectral band as the excitation spectral band. It is then radiation reflected by the object, or radiation backscattered by the object.
The emission radiation 14 may be emitted by the object in a fluorescence spectral band λfluo, different from the excitation spectral band λex. It is then fluorescence radiation. The emission image sensor is then a fluorescence image sensor, capable of forming an image Ifluo of the fluorescence radiation 14 produced by the object 10, in said fluorescence spectral band, under the effect of the illumination by the excitation beam 12. The examples which follow systematically consider the particular case of the acquisition and correction of a fluorescence image, knowing that the principles described apply to any type of image Iem emitted by the object, regardless of the emission spectral band, and in particular to an image of the radiation reflected by the object or an image of the radiation backscattered by the object. When the probe has several elementary excitation sources, the latter can be distributed around the fluorescence image sensor 16. [0010] The fluorescence image sensor 16 preferably comprises a filter 17. The filter 17 is a band-pass filter, centered in the fluorescence spectral band λfluo, so as to block the wavelengths outside this fluorescence spectral band. The focusing optical system 18 is configured to form an image Ifluo of the fluorescence radiation 14 on the matrix photodetector 19. It may in particular be an objective. [0011] The matrix photodetector comprises a plurality of pixels, so that each pixel of the fluorescence image Ifluo is optically coupled by the focusing optical system 18 to a surface element δS of the object 10, such a surface element forming part of a surface S of the object 10 placed facing the probe 1. The fluorescence sensor 16 comprises an optical axis Z defined by the focusing optical system 18 and the photodetector 19. [0012] The matrix photodetector 19 is a CCD-type sensor (Charge-Coupled Device) or a CMOS-type sensor (acronym for Complementary Metal-Oxide-Semiconductor).
The object 10 comprises one or more endogenous or exogenous fluorophores. In the case where the fluorophores are endogenous, one speaks of autofluorescence. The exogenous fluorophores are previously injected into the object, so as to bind specifically to targets, for example cancer cells. Each fluorophore is capable of emitting fluorescence radiation 14, in a fluorescence spectral band λfluo, when it is illuminated by excitation radiation 12, in an excitation spectral band λex. For example, when the fluorophore used is indocyanine green, or ICG (IndoCyanine Green), the excitation spectral band can be between 750 nm and 800 nm, the fluorescence spectral band being between 820 nm and 870 nm. The fluorescence image Ifluo has pixels r. The value of each pixel Ifluo(r) corresponds to the intensity of the fluorescence radiation emanating from a surface element δS(r') of the object, this surface element being optically coupled to this pixel. The positions r and r' respectively denote the coordinates of a pixel in the fluorescence image Ifluo and of a surface element on the surface S of the object 10. Each position r and/or r' may correspond to an abscissa x and an ordinate y, such that r = (x, y) and/or r' = (x', y'). The intensity of the fluorescence radiation emanating from each surface element may also be referred to as apparent fluorescence intensity, the fluorescence being able to be produced on the surface of the object or at depth, this depth being generally limited to a few centimeters. The fluorescence image Ifluo generated by the image sensor 16 gives the appearance that the fluorescence comes from the surface S of the object 10, whereas it can be generated at depth. A processor 30 is adapted to process the fluorescence images Ifluo formed by the image sensor 16.
This is for example a microprocessor placed in a computer; in particular, the processor is a microprocessor connected to a programmable memory in which is stored a sequence of instructions for performing the image processing operations and calculations described in this description. When the object has a fluorescent zone 13, located in the field of view of the probe 1, this fluorescent zone appears on the fluorescence image Ifluo in the form of a region of interest Ω. The region of interest Ω can be characterized by a fluorescence signal Sfluo. The probe 1 also comprises a distance detector, in the form of a rangefinder 20, for measuring a distance d between the excitation light source 11 and the object 10. This rangefinder 20 may be an optical, ultrasonic or microwave rangefinder. In general, the rangefinder 20 emits a wave 22 towards the object 10, then detects a wave 22' reflected by the object, from which a measurement of the distance is made. The wave 22 may be an optical wave in the visible or infrared spectrum, or an acoustic wave, or an electromagnetic wave of the microwave type. In this example, the rangefinder 20 is a laser rangefinder. [0013] In the examples shown in FIGS. 1 and 2, the image sensor 16, the excitation light source 11 and the telemeter 20 are integrated in the probe 1. The probe 1 is generally placed at a distance from the object 10 of between 10 cm and 30 cm, or even 50 cm or more. It can be moved manually by a practitioner, so as to locate any fluorescent zones 13 in the object 10. These fluorescent zones may be located on the surface or a few millimeters, or even centimeters, in depth. The practitioner may for example carry such a probe in his hand, and scan the operative field to perform a complete examination. The illumination Φ produced by the excitation light source 11 on the examined object is often spatially inhomogeneous, with some areas more illuminated than others, defining an illumination image.
It is recalled that the illumination Φ corresponds to a quantity of light per unit time and surface. The illumination image generally depends on the distance d between the light source 11 and the object 10 and can vary significantly between two different distances. FIG. 2 represents a front view of a variant according to which the excitation source 11 comprises two elementary excitation sources 11.1 and 11.2 arranged on either side of the fluorescence sensor 16. Each of these elementary excitation sources comprises a plurality of optical fibers, from the ends of which the excitation beam 12 is emitted. FIGS. 3A, 4A and 5A represent, for different distances d, the spatial distribution of the illumination with such a configuration. The two excitation sources 11.1 and 11.2 are constituted by the ends of two optical fibers connected to a laser source emitting at the excitation wavelength of 750 nm. To produce these figures, a homogeneous fluorescent screen is arranged facing the probe, the distance between the screen and the probe being respectively 10 cm, 20 cm and 30 cm. FIGS. 3B, 4B and 5B show the horizontal profile, passing through the center of the image, for each configuration. [0014] It can be seen that whatever the distance, the illumination is not homogeneous. It is maximal in the center of the observed field, and decreases significantly as one gets closer to the edges of the image. Moreover, the spatial distribution of the illumination varies according to the distance. This spatial heterogeneity impairs the correct quantification of the fluorescence signals Sfluo, the intensity of the fluorescence radiation 14 varying linearly with the illumination Φ produced by the excitation beam 12. Thus, with such spatially inhomogeneous illumination, the signal-to-noise ratio of the probe is not homogeneous: it is much higher in the central zone of the observed field than in the peripheral zone of this field.
According to a first embodiment, in order to remedy this problem, the inventors propose applying a correction function fd, dependent on said distance d, to each fluorescence image. This function corrects the fluorescence image Ifluo so as to obtain a corrected fluorescence image I'fluo, such that:

I'fluo = fd(Ifluo) = Ifluo / Md   (1a)

Md corresponding to an illumination image produced by the light source 11 at the distance d. This illumination image is representative of the spatial distribution of the illumination produced by the excitation source in a plane located at a distance d from the latter. The ratio Ifluo / Md implies that the intensity of each pixel of coordinates r of the fluorescence image Ifluo is divided by the intensity of the pixel of the same coordinates of the image Md. In other words, it is a term-by-term ratio between two matrices, so that the value of each pixel r of the corrected image is:

I'fluo(r) = fd(Ifluo(r)) = Ifluo(r) / Md(r)

To each distance d is associated an illumination image Md, the latter being determined during a calibration phase, by replacing the object 10 with a homogeneous screen, for example a black screen or, better, a fluorescent screen, the fluorescence being spatially homogeneous. Such a fluorescent screen may consist of a material naturally fluorescent in the wavelength band λfluo of the fluorescence image sensor 16. It may for example be a polymer, for example a plate of polymethyl methacrylate, a material known by the acronym PMMA, sold under the reference Altuglas (registered trademark) 25001. The image formed on this screen, placed at a distance d from the light source 11, then corresponds to the illumination image Md associated with said distance d. It comprises light areas and dark areas, representing the spatial distribution, often inhomogeneous, of the illumination Φ as a function of the distance d. Several illumination images Md1 ... Mdn can be produced by considering different distances, for example every 1 or 2 cm.
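As a rough illustration (not part of the patent), the term-by-term ratio of equation (1a) can be sketched in Python with NumPy; the function name and the epsilon guard are assumptions introduced for the example:

```python
import numpy as np

def correct_with_illumination(i_fluo, m_d, eps=1e-9):
    """Corrected image I'fluo = Ifluo / Md, computed term by term:
    each pixel of the fluorescence image is divided by the pixel of
    the same coordinates in the illumination image Md."""
    i_fluo = np.asarray(i_fluo, dtype=float)
    m_d = np.asarray(m_d, dtype=float)
    # eps guards against division by zero in dark corners of Md
    return i_fluo / np.maximum(m_d, eps)

# A uniform fluorophore seen under inhomogeneous illumination:
i_fluo = np.array([[1.0, 2.0], [2.0, 4.0]])   # measured intensities
m_d    = np.array([[0.5, 1.0], [1.0, 2.0]])   # calibrated illumination
print(correct_with_illumination(i_fluo, m_d))  # every pixel becomes 2.0
```

Dividing by the calibrated illumination image flattens the spatial heterogeneity: pixels that were dim only because they were poorly illuminated are restored to the same level as the well-lit center of the field.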
Images corresponding to spacings between two distances di, di+1 can be obtained by interpolation between the two images Mdi and Mdi+1. As many illumination images Md are then stored as there are distances d considered. The illumination images Md' and Md'', respectively associated with two different distances d' and d'', are generally different. The different images Md, corresponding to each distance considered, can be stored in a memory 32 connected to the processor 30. An illumination image Md can also be produced for different configurations of the focusing optical system 18 used in the fluorescence image sensor 16, in particular according to a magnification G of this optical system, so as to constitute illumination images Md,G corresponding to different pairs of values of the distance d and of the magnification G. Each illumination image Md or Md,G makes it possible to take into account the vignetting affecting the fluorescence image sensor 16 or, more generally, any inhomogeneity in the response function of this sensor, whether due to the objective 18, to the matrix photodetector 19, or to the filter 17. Under these conditions, each illumination image Md or Md,G can be qualified as a blank image, because it corresponds to a calibration image, on a homogeneous background, of the probe when the light source 11 is activated. Each fluorescent zone 13 of the object can then be characterized not only by its position and its spatial extent, as it appears in the region of interest Ω of the fluorescence image, but also by quantitative information, in the form of a signal Sfluo representative of the intensity of the pixels constituting this region of interest. If Ω defines the spatial extent of said region of interest on the fluorescence image, the quantitative information corresponds to the fluorescence signal Sfluo.
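The interpolation between two calibrated distances can be sketched as follows (a minimal illustration, not the patent's implementation; the dictionary layout of the calibration set and the linear weighting are assumptions):

```python
import bisect
import numpy as np

def illumination_at(d, calib):
    """Return an illumination image for distance d, linearly
    interpolating between the two stored images Mdi and Mdi+1 that
    bracket d. `calib` maps a distance (cm) to a 2-D array Md."""
    distances = sorted(calib)
    if d <= distances[0]:
        return calib[distances[0]]
    if d >= distances[-1]:
        return calib[distances[-1]]
    j = bisect.bisect_left(distances, d)     # first stored distance >= d
    d0, d1 = distances[j - 1], distances[j]
    t = (d - d0) / (d1 - d0)                 # weight of the farther image
    return (1 - t) * calib[d0] + t * calib[d1]

calib = {10: np.full((2, 2), 1.0), 20: np.full((2, 2), 3.0)}
print(illumination_at(15, calib))  # halfway: every pixel is 2.0
```

This keeps the stored calibration set small (one image every 1 or 2 cm) while still providing an illumination image for any measured distance in between.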
The latter can be defined, in a non-exhaustive way, by one of the following expressions: the maximum intensity in the region of interest Ω: Sfluo = maxΩ(I'fluo(r)); the mean intensity in the region of interest Ω: Sfluo = meanΩ(I'fluo(r)), where meanΩ denotes the average operator over Ω; the total intensity in the region of interest Ω: Sfluo = ΣΩ I'fluo(r); I'fluo(r) designating the intensity of the pixel of coordinates r in the corrected fluorescence image I'fluo. The practitioner is then able to compare the different fluorescence zones not only according to the extent of the zones of interest on the fluorescence image, but also according to their respective intensities, the correction function fd having corrected the spatial heterogeneity of the illumination. The correction can be performed in real time, that is to say at the rate of acquisition of the fluorescence images, or in post-processing. The use of such an illumination image Md has the advantage of taking into account, during the application of the correction function, not only the spatial distribution of the illumination, but also the evolution of the illumination as a function of the distance d between the excitation source and the object. Indeed, the intensity of the fluorescence radiation emitted by the object 10 is proportional to the illumination. Thus, by implementing a correction function taking into account an illumination image, the method makes it possible to correct both the spatial inhomogeneity of the illumination and its evolution as a function of the distance between the excitation light source 11 and the object 10. [0015] Such a correction makes it possible to obtain a corrected fluorescence image I'fluo in which the value of each pixel I'fluo(r) is representative of an apparent fluorophore concentration in a surface element δS(r') of the object optically coupled to said pixel r.
Indeed, as previously indicated, Ifluo(r) corresponds to the intensity of the fluorescence light emanating from the surface element δS(r') conjugated to the pixel r of the image Ifluo. Thus:

Ifluo(r) ∝ φd(r') × η × c(r')   (1')

where: - φd(r') denotes the illumination experienced by the surface element δS(r') of the object, located at the distance d from the excitation source; - η denotes the quantum yield of the fluorophore generating the fluorescence signal; - c(r') denotes the apparent concentration of the fluorophore at the surface element δS(r'). The term "apparent concentration" refers to a fluorophore concentration at the origin of the fluorescence signal emanating from the surface element δS(r'). When acquiring the illumination image Md, with the aid of a fluorescent screen, each pixel of the image Md is such that:

Md(r) ∝ φd(r') × η × c0(r')   (1'')

where: - φd(r') denotes the illumination on the fluorescent screen; - η denotes the quantum yield of the fluorophore generating the fluorescence signal; - c0(r') denotes the apparent concentration of fluorophore in the calibration screen at the point r' conjugate to the point r of the image. This concentration is preferably homogeneous in the screen, so that c0(r') = c0. It is noted that the same excitation source is used during the respective acquisitions of the fluorescence image and the illumination image. The illumination produced by the excitation source at the distance d, φd(r'), is therefore the same during the acquisition of these two images. The corrected image I'fluo, taking the form of a term-by-term ratio between the fluorescence image Ifluo and the illumination image Md, is then such that:

I'fluo(r) = Ifluo(r) / Md(r) ∝ c(r') / c0

Thus, the corrected image represents the spatial distribution of the fluorophore concentration at the surface.
The fluorescence signal Sfluo, as previously defined, associated with a fluorescence zone 13, then represents, according to its definition, the total or average concentration of the fluorophore in the zone concerned. Each illumination image can be normalized by taking into account the value of the pixel in which the illumination is maximum, so that normalization can consist in carrying out the following operation:

M'd(r) = Md(r) / max(Md(r))

M'd(r) representing the value of the pixel of coordinates r of the normalized illumination image M'd; Md(r) representing the value of the pixel of coordinates r of the illumination image Md; max(Md(r)) representing the value of the pixel of the illumination image Md representative of the maximum illumination. According to this variant, the correction function makes it possible to establish a corrected fluorescence image such that:

I'fluo = fd(Ifluo) = Ifluo / M'd   (1b)

An advantage of such a normalization is that it allows a correction independent of the intensity of the excitation source, and of its possible fluctuations. FIG. 6A represents an uncorrected fluorescence image Ifluo, when the distance between the source and a homogeneous screen is 20 cm. FIG. 6B shows the fluorescence image I'fluo corrected using equation (1b), with a normalized illumination image M'd. FIG. 6C shows a horizontal profile, passing through the center of the image, for each of these images. It is observed that the profile associated with the corrected image reflects, with fluctuations, a homogeneous fluorescence signal, corresponding to reality, which is not the case for the uncorrected image. According to another variant of this first embodiment, the correction function is established for a distance at which it is considered that the illumination is sufficiently representative of a certain range of distances.
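The normalization M'd = Md / max(Md) can be sketched as follows (an illustrative helper, with an assumed name):

```python
import numpy as np

def normalize_illumination(m_d):
    """Normalized illumination image M'd = Md / max(Md), so that the
    brightest pixel is 1 and the correction no longer depends on the
    absolute intensity of the excitation source."""
    m_d = np.asarray(m_d, dtype=float)
    return m_d / m_d.max()

m_d = np.array([[2.0, 4.0], [8.0, 6.0]])
print(normalize_illumination(m_d))  # [[0.25, 0.5], [1.0, 0.75]]
```

Because the normalized image is dimensionless and bounded by 1, rescaling the source power (or its drift over time) multiplies Md everywhere by the same factor and leaves M'd, hence the correction, unchanged.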
According to this variant, the correction function f does not vary with the distance, and may use the same illumination image M, whether normalized or not, whatever the distance used. In this case:

I'fluo = f(Ifluo) = Ifluo / M   (1c)

The implementation of the corrections described in connection with this first embodiment, whatever they are, preferably supposes that the relative position of the fluorescence image sensor and of the excitation light source is known and corresponds to the conditions under which the illumination images Md are established. Also, such a correction is particularly suitable for probes integrating, in the same body, the light source as well as the fluorescence image sensor, the relative position of the light source and of the fluorescence image sensor being fixed. The devices shown in FIGS. 1, 2, 9, 10 and 11 correspond to this case. [0016] Referring to FIG. 7A, the main steps of a method of correcting a fluorescence image as a function of the distance d between the excitation light source 11 and the object 10 are now described according to this first embodiment. During a step 100, a fluorescence image Ifluo is acquired. During a step 200, the distance d between the excitation light source 11 and the object 10 is measured, for example by means of the rangefinder 20 or by means of an autofocus system 27, described later. Step 300 consists in applying a correction function fd, dependent on said distance d, to each fluorescence image Ifluo, so as to obtain a corrected fluorescence image, according to equation (1a), (1b) or (1c). [0017] According to a second embodiment, the correction function has the effect of compensating for the variations of the illumination intensity as a function of the distance between the object and the light source 11. [0018] Since the light source 11 is generally placed at a distance greater than 10 cm from the observed object, it can be considered, seen from the object, as a point source.
According to this hypothesis, the amount of light illuminating the object per unit area and per unit time, i.e. the illumination φ, varies in inverse proportion to the square of the distance d between the light source 11 and the object 10, according to a so-called inverse square law of the type 1/(4πd²). Now, the amount of fluorescence signal emitted by a fluorescent zone 13 of the object is proportional to the amount of excitation light reaching this zone per unit of time, that is to say to the illumination produced by the excitation light source on said zone. When the practitioner moves the probe 1 away from the same fluorescent zone 13, the intensity of the fluorescence radiation emitted by this zone varies, and the intensity of the signal Sfluo of the region of interest appearing on the fluorescence image is modified accordingly. This variation of intensity is very marked, because of its dependence on the inverse of the square of the distance. For example, if the distance d passes from 20 cm to 40 cm, the intensity variation on the fluorescence image can reach a factor of 4.
[0019] If φ and φref respectively denote the illumination produced by the excitation light source 11 at a distance d and at a reference distance dref, then:

φ = φref × dref² / d²

Also, if Ifluo(r) and Ifluo-ref(r) represent the intensity of a fluorescence radiation, coming from the same surface element δS(r'), collected by the pixel of coordinate r of the image when the light source 11 is placed respectively at the distance d and at the reference distance dref from the object:

Ifluo(r) = Ifluo-ref(r) × dref² / d² (3)

According to this embodiment, the effect of the correction function fd is to compensate for the distance variation with respect to the reference distance dref. Also, if a fluorescence image is acquired at a distance d, the correction function has the effect of bringing the intensity Ifluo(r) of each pixel, corresponding to a fluorescence zone 13, back to a reference value Ifluo-ref(r) corresponding to the reference distance. After correction, the corrected intensity I'fluo(r) at the pixel r is then:

I'fluo(r) = fd(Ifluo(r)) = Ifluo(r) × d² / dref² (4)

Given equation (3), it follows that:

I'fluo(r) ≈ Ifluo-ref(r) (5)

the symbol ≈ representing an equality up to statistical fluctuations. [0020] This reasoning can be applied to all the pixels constituting a fluorescence image. Thus, if Ifluo and I'fluo represent a fluorescence image respectively before and after applying the correction function fd:

I'fluo = fd(Ifluo) = Ifluo × d² / dref² (6a)

Each pixel of the image Ifluo is then multiplied by the scalar d²/dref². The reference distance dref may be a predetermined distance, for example 20 cm. The reference distance dref can also be determined case by case by the practitioner using the probe 1. The practitioner then has a selector 15, for example a push button, disposed on the probe 1, so that, by actuating said selector 15, the distance at which the probe lies relative to the object is considered to be a reference distance.
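The scalar compensation of equations (4) and (6a) amounts to one multiplication per image. A minimal NumPy sketch, with illustrative values and the 20 cm reference distance mentioned above:

```python
import numpy as np

def correct_distance(I_fluo, d, d_ref=0.20):
    # Equations (4)/(6a): each pixel is multiplied by the scalar
    # d**2 / d_ref**2, compensating the inverse-square fall-off
    # of the illumination with distance.
    return I_fluo * (d / d_ref) ** 2

# The same fluorescent zone imaged at 40 cm appears 4x weaker than at
# the 20 cm reference distance (inverse-square law).
I_ref = np.full((8, 8), 100.0)            # intensity at the reference distance
I_at_40cm = I_ref * (0.20 / 0.40) ** 2    # raw acquisition at d = 40 cm
I_corr = correct_distance(I_at_40cm, d=0.40)
# I_corr is brought back to the 100.0 reference level.
```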
This distance is then stored in a memory 32 as a reference distance for the subsequent corrections to be established. This makes it possible to select the reference distance dref on a case by case basis, depending on the fluorescence image Ifluo corresponding to this distance, this image being called the reference fluorescence image Ifluo-ref. This selector 15 may be provided in all the described embodiments. [0021] Equation (5) shows that the application of the correction function fd causes a same fluorescent zone 13 to produce, on the corrected fluorescence image I'fluo, a region of interest in which the intensity I'fluo(r) of each pixel depends little, or not at all, on the distance d between the excitation light source 11 and the examined object 10. According to one variant, the rangefinder 20 is an optical rangefinder employing a pixelized photodetector 25 for measuring a plurality of distances d(r') between the excitation source and a plurality of surface elements δS(r') of the object 10, respectively. In other words, the telemetry sensor 20 comprises a matrix photodetector, detecting the wave 22' reflected by the object. Each pixel of this photodetector is able to establish a measurement of the distance separating it from the surface element of the object to which it is optically coupled, that is to say the surface element conjugate with this pixel. Such a photodetector may for example be a 3D time-of-flight camera, for example the SR4000 model marketed by Mesa Imaging. According to this embodiment, it is possible to associate, with each pixel r of the fluorescence image Ifluo, a distance d(r') between the excitation source 11 and the surface element δS(r') conjugate with the pixel r.
[0022] The correction can therefore be carried out not by applying a scalar correction term to the entire image Ifluo, but, as in the first embodiment, term by term, so that:

I'fluo(r) = fd(Ifluo(r)) = Ifluo(r) × d(r')² / dref² (6b)

Another way of expressing this correction is to consider that, from each measured distance d(r'), a distance image D is formed, each term D(r) of this distance image corresponding to the distance between the surface element δS(r') conjugate with the pixel r of the image Ifluo and the excitation light source. The correction then comprises a term-by-term product of the fluorescence image Ifluo by the square of the image D, weighted by the scalar 1/dref²:

I'fluo = fd(Ifluo) = Ifluo × D² / dref² (6c)

Referring to FIG. 7B, the main steps of a method for correcting a fluorescence image as a function of the distance d between the excitation light source 11 and the object 10 are now described, according to this second embodiment. Steps 100 and 200 are similar to those described with respect to Figure 7A. Step 400 consists in applying a distance correction function fd, dependent on said distance d, to each fluorescence image Ifluo, so as to obtain a corrected fluorescence image I'fluo, according to equation (6a), (6b) or (6c). The correction functions described in the first and second embodiments can be implemented independently of one another, but can also be combined. [0023] Thus, according to a third embodiment, the correction function combines the correction of the spatial inhomogeneity of the illumination and the compensation of the distance. A correction function is applied to the image combining the correction functions described in connection with the first or the second embodiment. This function fd may be such that:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × d² / dref² (7a)

or:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × d(r')² / dref² (7b)

or again:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × D(r)² / dref² (7c)

Here M'd refers to the normalized illumination image, as described in a variant of the first embodiment. [0024] Referring to FIG. 7C, the main steps of a method for correcting a fluorescence image as a function of the distance d between the excitation light source 11 and the object 10 are now described, according to this third embodiment. Steps 100 and 200 are similar to those described with respect to Figure 7A. Step 500 consists in applying a distance correction function fd, dependent on said distance d, to each fluorescence image Ifluo, so as to obtain a corrected fluorescence image I'fluo, according to equation (7a), (7b) or (7c). Other known correction functions can also be taken into account, in particular a correction function relating to the exposure time, acting on each image so as to generate images normalized with respect to a reference time tref, defined beforehand, for example 50 ms. [0025] Thus, in a fourth embodiment, the correction function fd may include a normalization term, in the form of a ratio between the reference exposure time tref and the exposure time t of the fluorescence image. For example, according to this embodiment, the correction function fd may be such that:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × tref / t (8a)

or:

I'fluo = fd(Ifluo) = (Ifluo / Md) × tref / t (8b)

or:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × (d² / dref²) × tref / t (8c)

It may be noted that equations (8a) and (8c) use the normalized illumination image M'd, whereas equation (8b) uses the non-normalized illumination image Md.
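The term-by-term corrections of the type of equations (6c) and (7a) can be sketched as follows. This is a hedged NumPy illustration under stated assumptions: the tilted-surface distance image D and all numeric values are hypothetical, and uniform illumination is assumed for the combined case.

```python
import numpy as np

def correct_distance_map(I_fluo, D, d_ref=0.20):
    # Equation (6c): term-by-term product of the fluorescence image by the
    # squared distance image D, weighted by the scalar 1 / d_ref**2.
    return I_fluo * D**2 / d_ref**2

def correct_combined(I_fluo, M_d, d, d_ref=0.20):
    # Equation (7a): division by the normalized illumination image M'_d,
    # combined with the scalar inverse-square distance compensation.
    return I_fluo / (M_d / M_d.max()) * (d / d_ref) ** 2

# Hypothetical distance image: a slightly tilted surface around 30 cm,
# as a spatially resolved (time-of-flight) rangefinder might measure it.
D = 0.30 + np.linspace(-0.01, 0.01, 16).reshape(4, 4)
I_fluo = 100.0 * (0.20 / D) ** 2      # per-pixel inverse-square attenuation
I_corr = correct_distance_map(I_fluo, D)
# Every pixel is restored to the same 100.0 reference level.
```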
Naturally, when the distance measurement is spatially resolved, that is to say when a distance d(r') is established between the excitation source and each surface element δS(r') conjugate with a pixel r of the fluorescence image, then:

I'fluo(r) = fd(Ifluo(r)) = Ifluo(r) × (d(r')² / dref²) × tref / t (8d)

or:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × (d(r')² / dref²) × tref / t (8e)

or again:

I'fluo = fd(Ifluo) = (Ifluo / M'd) × (D² / dref²) × tref / t (8f)

It may be noted that equations (8a), (8c), (8e) and (8f) use the normalized illumination image M'd, whereas equation (8b) uses the non-normalized illumination image Md. In connection with FIG. 7D, the main steps of a method for correcting a fluorescence image as a function of the distance d between the excitation light source 11 and the object 10 are now described, according to this fourth embodiment. Steps 100 and 200 are similar to those described with respect to Figure 7A. Step 600 consists in applying a distance correction function fd, dependent on said distance d, to each fluorescence image Ifluo, so as to obtain a corrected fluorescence image I'fluo, according to one of equations (8a) to (8f). FIGS. 8A to 8D illustrate the effect of the previously described correction functions on an image. [0026] Figure 8A was obtained by placing the probe 1 shown in Figure 2 facing a test pattern with dark patterns formed on a fluorescent screen as previously described. The pattern is illuminated by an excitation beam, in an excitation spectral band centered on the 750 nm wavelength, and the fluorescence image is produced by collecting the fluorescence light in a fluorescence spectral band extending between 800 nm and 850 nm. [0027] The probe is placed at a distance d of 18 cm; the exposure time t is 40 ms. [0028] FIG. 8B represents an image produced using a correction function as described in the first embodiment, dividing, term by term, the fluorescence image Ifluo by a normalized illumination image M'd corresponding to the distance d = 18 cm.
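A sketch of the full correction in the spirit of equation (8c), adding the exposure-time normalization tref/t to the illumination and distance terms. The acquisition values below (uniform illumination, d = 18 cm, t = 40 ms, tref = 50 ms) are illustrative, not taken from the patent's experiments.

```python
import numpy as np

def correct_full(I_fluo, M_d, d, t, d_ref=0.20, t_ref=0.050):
    # Equation (8c): illumination normalization, inverse-square distance
    # compensation, and exposure-time normalization to t_ref.
    M_norm = M_d / M_d.max()              # normalized illumination image M'_d
    return I_fluo / M_norm * (d / d_ref) ** 2 * (t_ref / t)

# Hypothetical acquisition: uniform illumination, d = 18 cm, t = 40 ms.
M_d = np.ones((4, 4))
I_fluo = np.full((4, 4), 100.0)
I_corr = correct_full(I_fluo, M_d, d=0.18, t=0.040)
# Overall scaling: (18/20)**2 * (50/40) = 0.81 * 1.25 = 1.0125
```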
It is observed that the gray levels of the image obtained, apart from the dark patterns of the test pattern, are more homogeneous. In particular, the edges of the image are not darkened as in image 8A. FIG. 8C shows an image made using a correction function as described in the third embodiment, combining the illumination correction and the distance compensation, the reference distance dref being equal to 20 cm. A slight darkening of the image is observed, due to the multiplication of each pixel by the scalar 18²/20² ≈ 0.81. FIG. 8D shows an image made using a correction function as described in the fourth embodiment, combining the illumination correction, the compensation of the distance with respect to a reference distance of 20 cm, and the normalization of the exposure time, the reference exposure time tref being 83 ms. The comparison between images 8C and 8D makes it possible to visualize the effect of the normalization by the exposure time, with a significant increase in the gray level of the lighter pixels due to the multiplication of each pixel by the scalar tref/t = 83/40. FIG. 9 shows another embodiment, in which the excitation light source 11 is integrated in the probe 1. According to this embodiment, the distance measurement between the excitation light source 11 and the object 10 is not carried out using a rangefinder, but using an autofocus system 27. The autofocus system 27 is a conventional system for automatically focusing the fluorescence image sensor 16 with respect to the object. It relies on a measurement of the contrast of the fluorescence image, considering different focusing distances. According to one variant, the autofocus system 27 is based on a so-called phase detection method, also known to those skilled in the art. The autofocus system 27 is able to provide an item of distance information d between the excitation light source 11, integral with the probe 1, and the examined object. It then acts as a distance detector.
[0029] The inventors consider, however, that it is preferable to have a rangefinder 20 of the optical, acoustic or microwave type, as previously mentioned, since this makes it possible to obtain a more precise and faster measurement of the distance. Indeed, the fluorescence image is often not sufficiently contrasted to allow a sufficiently reliable measurement of this distance using an autofocus system. In addition, the determination of the focusing distance by the autofocus system may be slow, and not compatible with the real-time constraints required during surgery. FIG. 10 represents another embodiment, in which the probe 1 comprises a separator 25, for example a semi-reflecting plate, capable of directing the fluorescence optical radiation 14, incident on the probe, towards the matrix photodetector 19, through the fluorescence filter 17 previously described, so as to form a fluorescence image Ifluo. The separator 25 is also able to direct optical radiation 24 incident on the probe 1, in a visible spectral band, towards a second matrix photodetector 29, able to form an image Ivisible of the examined object in visible light. The matrix photodetector 29, the separator 25 and the objective 18 form a visible image sensor 26. By visible spectral band is meant a spectral band extending over the visible wavelengths, for example between 400 nm and 700 nm. The processor 30 can then perform a superposition of the fluorescence image Ifluo, generated by the fluorescence image sensor 16, and the visible image Ivisible of the object, generated by the visible image sensor 26. Preferably, the optical axes of the fluorescence image sensor 16 and of the visible image sensor 26 are merged, so as to facilitate such a superposition. In this example, the objective 18 is common to the visible image sensor 26 and the fluorescence image sensor 16. According to one variant, the visible image sensor comprises a lens 28 that is specific to it.
The visible image sensor 26 may include an autofocus system 27, whereby the visible image Ivisible of the object is used for the automatic focusing. The autofocus system 27 is able to provide an item of distance information d between the excitation light source, integral with the probe 1, and the examined object. It then acts as a distance detector. It should be noted that the integration of a visible image sensor 26, able to form an image of the object in visible light, can be provided in all the embodiments described in this application. [0030] FIG. 11 shows an embodiment in which the rangefinder 20 is an optical rangefinder. Through a semi-reflecting mirror 25', the rangefinder 20 is able to emit a light wave through the optical system 18 of the fluorescence sensor 16. The semi-reflecting mirror 25' reflects the optical wave 22 emitted by the rangefinder 20, as well as the optical wave 22' reflected by the object. It transmits the fluorescence light 14 emitted by the object 10 to the fluorescence photodetector 19. Preferably, the rangefinder is positioned so that the light wave it generates is centered with respect to said optical system 18. This wave then propagates towards the object 10 along the optical axis Z of the fluorescence sensor 16. This makes it possible to measure the distance d between the probe 1 and the point of the object situated at the center of the image acquired by the sensor 16, regardless of the distance between the probe and the object. The area of the object targeted by the rangefinder 20, from which the distance d is determined, is then precisely known. FIG. 12 represents another embodiment, in which the excitation light source 11 and the rangefinder 20 are arranged in a module 2 independent of the probe 1. [0031] For example, according to this embodiment, the module 2 is a surgical light in which the rangefinder 20 and the excitation light source 11 are integrated.
The surgical light then makes it possible to illuminate the object 10 not only in a wide visible spectral band, in the same way as a conventional surgical light, but also, more specifically, in the excitation spectral band λex of the fluorophores potentially present in the object. The integration of the excitation light source 11 into a surgical light generally makes it possible to obtain a more uniform spatial distribution of the illumination. According to this embodiment, it is preferable to implement a correction according to the square of the distance, described in connection with FIG. 7B, optionally supplemented by a correction according to the exposure time. FIG. 13 shows another embodiment, in which the excitation light source 11 is integrated in a module 2 independent of the probe 1, whereas the rangefinder 20 is integral with the probe 1. The position of the module 2 relative to the probe 1 is known, in particular the distance d2 between the excitation light source 11 and the rangefinder 20, and possibly the orientation of the module 2 with respect to the probe 1. According to this embodiment, the rangefinder 20 measures the distance d1 separating it from the object 10, the processor 30 then being able to determine the distance d between the excitation light source 11 and the object 10 as a function of the measurement of the distance d1 and of the knowledge of the distance d2, and possibly of the orientation, of the module 2 with respect to the probe 1. According to this embodiment, it is preferable to implement a correction according to the square of the distance, described in connection with FIG. 7B, possibly supplemented by a correction according to the exposure time. In the embodiments described in FIGS. 1, 2, 11, 12 and 13, the rangefinder 20 may be a spatially resolved rangefinder, making it possible to obtain a plurality of distances d(r') between the excitation source and a plurality of surface elements δS(r') of the object 10, respectively, as previously described. The probe 1 as previously described can be implemented in open surgery applications, which allow the observation of an operating field. When it integrates the excitation light source 11, as shown in FIGS. 1, 8 and 9, it can also be used in other modalities, in particular in endoscopy or in laparoscopy, with an adaptation of its dimensions. As stated in the first paragraphs of the detailed description, although the described embodiments relate to fluorescence imaging, which is the main short-term targeted application, the described principles can be generalized to any type of image emitted by the object, whatever the emission spectral band, and in particular to an image of radiation reflected by the object or an image of radiation backscattered by the object, the emission spectral band then corresponding to all or part of the excitation spectral band.
Claims (16) [0001] CLAIMS 1. A method of correcting a fluorescence image, comprising the following steps: illumination of an object (10) with an excitation light source (11); detection of fluorescence radiation (14) by a fluorescence image sensor (16), the fluorescence radiation (14) being emitted by the object (10) under the effect of said illumination; acquisition of said fluorescence image (Ifluo) by said fluorescence image sensor (16) from the detected fluorescence radiation (14); the method being characterized in that it comprises the application of an illumination correction function (f, fd) to the fluorescence image (Ifluo), said correction function using an illumination image (M, Md, M'd), representative of the spatial distribution of the illumination (φ) produced on the object (10) by the excitation light source (11). [0002] 2. A correction method according to claim 1, also comprising a step of measuring a distance (d) between the excitation source (11) and the object (10), said correction function (fd) being dependent on this measured distance (d), using an illumination image (Md, M'd) associated with said distance (d). [0003] 3. A correction method according to claim 2, wherein the correction function is such that two illumination images (Md, Md'), respectively associated with two different measured distances (d, d'), are different from one another. [0004] 4. A correction method according to any one of claims 1 to 3, said correction function (f, fd) comprising a ratio, term by term, of the fluorescence image (Ifluo) by said illumination image (M, Md, M'd). [0005] 5. A correction method according to any one of claims 1 to 4, wherein the fluorescence image is acquired with an exposure time (t), said correction function also being able to normalize the fluorescence image with respect to a reference exposure time (tref). [0006] 6.
A correction method according to any one of claims 2 to 5, wherein said correction function is also able to correct said fluorescence image as a function of the square of said measured distance (d). [0007] 7. A correction method according to claim 6, wherein the correction function takes into account the square of said measured distance (d) and the square of a reference distance (dref). [0008] 8. A correction method according to claim 7, comprising a step of selecting said reference distance (dref) according to the fluorescence image (Ifluo). [0009] 9. A correction method according to claim 7 or claim 8, wherein the correction function (fd) performs a product of said fluorescence image (Ifluo) by a ratio d²/dref² between the square of said measured distance and the square of said reference distance. [0010] 10. A correction method according to any one of claims 2 to 9, wherein the distance (d) between the excitation light source (11) and the object (10) is measured by means of a rangefinder (20) able to emit a wave (22) towards the object (10) and to detect a wave (22') reflected by said object, from which said distance (d) is measured. [0011] 11. The correction method of claim 10, wherein the rangefinder (20) is a laser rangefinder or an ultrasonic rangefinder or a microwave rangefinder. [0012] 12. A correction method according to any one of claims 2 to 9, wherein the distance (d) between the excitation light source and the object (10) is measured by an autofocus system (27), able to automatically establish a focusing distance of the fluorescence image sensor (16) relative to the object (10). [0013] 13.
A correction method according to any one of the preceding claims, also comprising a step of acquiring a visible image (Ivisible) of the object (10) using a visible image sensor (26). [0014] 14. A device (1) for acquiring a fluorescence image comprising: an excitation light source (11), able to illuminate an object (10) in an excitation spectral band (λex); a fluorescence image sensor (16), able to collect fluorescence radiation (14) emitted by said object (10), in a fluorescence spectral band (λfluo), under the effect of the illumination by the excitation light source, the fluorescence image sensor being able to acquire a fluorescence image (Ifluo) from the fluorescence radiation collected, the fluorescence image sensor being centered on an optical axis (Z); a rangefinder (20) for measuring a distance (d) between the excitation light source (11) and the object (10), the rangefinder (20) being able to emit an optical wave (22) in the direction of the object (10) and to detect an optical wave (22') reflected by said object, from which said distance (d) is measured; the acquisition device being characterized in that the rangefinder (20) is able to emit said optical wave (22) along said optical axis (Z), and in that it also comprises a processor able to implement the method of correcting said fluorescence image (Ifluo) according to any one of claims 1 to 13. [0015] 15. A device for acquiring a fluorescence image according to claim 14, characterized in that the fluorescence sensor comprises a focusing optical system (18), the rangefinder (20) being configured to emit an optical wave, in the direction of the object, centered with respect to this optical system. [0016] 16. A device for acquiring a fluorescence image according to claim 14 or claim 15, characterized in that it also comprises a selector (15) of a reference distance (dref) for storing said reference distance in a memory (32).
Family patents:
EP3295153A1 | 2018-03-21
US20180136129A1 | 2018-05-17
US10634615B2 | 2020-04-28
WO2016181076A1 | 2016-11-17
FR3036187B1 | 2019-09-13