METHOD AND SYSTEM FOR FULL FIELD INTERFERENTIAL MICROSCOPY IMAGING
Abstract:
An imaging system (20) for full-field interference microscopy of a scattering volume sample (206), comprising: - an interference device (200) comprising a reference arm on which a reflection surface (205) is arranged, the interference device being adapted to produce, when the sample is disposed on an object arm of the interference device, at each point of an imaging field, an interference between a reference wave obtained by reflection of incident light waves on an elementary surface of the reflection surface (205) corresponding to said point of the imaging field and an object wave obtained by backscattering of incident light waves by a voxel of a slice of the sample at a given depth, said voxel corresponding to said point of the imaging field, - an acquisition device (208) adapted to acquire, at a fixed path difference between the object arm and the reference arm, a temporal succession of N two-dimensional interferometric signals resulting from the interference produced at each point of the imaging field, - a processing unit (220) configured to calculate an image (IB, IC) representative of temporal variations of intensity between said N two-dimensional interferometric signals.

Publication number: FR3034858A1
Application number: FR1553120
Filing date: 2015-04-10
Publication date: 2016-10-14
Inventors: Albert Claude Boccara; Fabrice Harms
Applicant: LLTECH MANAGEMENT
IPC main class:
Description:
[0001] TECHNICAL FIELD
The present disclosure relates to a method and a system for imaging by full-field interference microscopy. It is applicable in particular to cellular and intracellular imaging.

[0002] STATE OF THE ART
The technique of image acquisition by full-field, incoherent-light interference microscopy, known as full-field OCT (OCT being the acronym of "Optical Coherence Tomography"), is a non-invasive, non-destructive and endogenous method that is very efficient for the acquisition of images of biological tissues.

[0003] The full-field OCT imaging technique is described, for example, in the article "Full-field optical coherence tomography" by A. Dubois and C. Boccara, in "Optical Coherence Tomography - Technology and Applications", Wolfgang Drexler, James G. Fujimoto, Editors, Springer 2009. The full-field OCT imaging technique is also described in the French patent application FR2817030.

[0004] The full-field OCT imaging technique is based on the exploitation of the light backscattered by a sample when it is illuminated by a light source with a short coherence length, and in particular on the exploitation of the light backscattered by microscopic cell and tissue structures in the case of a biological sample. This technique exploits the low coherence of the light source to isolate the light backscattered by a virtual slice at depth in the sample. The use of an interferometer makes it possible to generate, by an interference phenomenon, an interference signal representative of the light selectively coming from a given slice of the sample, and to eliminate the light coming from the rest of the sample. The full-field OCT imaging technique provides three-dimensional images with a typical resolution of the order of 1 μm, finer than the resolutions of the order of 10 μm that can be obtained with other conventional OCT techniques such as spectral-domain OCT (known as "Fourier-domain OCT" or "spectral-domain OCT"). With such a resolution, one can visualize the majority of tissue structures: blood vessels and their walls, collagen, adipocytes, etc. This technique thus makes it possible to visualize microstructures of various biological tissues such as the brain, breast, skin, liver, etc., as described for example in the article "Large Field, High Resolution Full-Field Optical Coherence Tomography: A Pre-Clinical Study of Human Breast Tissue and Cancer Assessment", O. Assayag et al., Technology in Cancer Research and Treatment, Volume 13, No. 5, October 2014. In addition, this technique is particularly fast: it is thus possible to generate, using a full-field OCT microscope, an image representative of a slice at depth whose surface is several cm² in only a few minutes.

[0005] FIG. 1 is an example of an image obtained by the full-field OCT technique for a tissue sample of a liver (optical biopsy made on a rat liver), on which collagen fibers in particular are visible. The image, referenced IA, corresponds to a sample area of size 800 μm by 800 μm. This figure illustrates the possibilities of this technique for highlighting and visualizing very fine biological structures. It has been shown that the full-field OCT imaging technique is potentially usable for medical applications such as cancer diagnosis, since the visualization of microstructures and tissue architecture makes it possible, in a large number of cases, to distinguish a healthy tissue from a cancerous tissue in a few minutes, non-destructively.
A full-field OCT image acquired during the time of the operation makes it possible, for example, to perform a first level of diagnosis, or even to optimize the surgical procedure in the case, for example, of cancers, so as to possibly avoid a re-operation should histological analysis reveal the presence of tumor cells only a few days after the excision. In addition, the full-field OCT technique allows the acquisition of in-vivo images, possibly in-situ with, for example, endoscopes, which reinforces the relevance of the technique in the medical field. However, although the full-field OCT technique has sufficient theoretical spatial resolution to generate biological tissue images at the cellular and intracellular scale (of the order of 1 μm in all three dimensions), the applicants have shown that in practice it does not always make it possible to clearly distinguish cells, or even nuclei or internal structures of cells (membrane, nucleus, cytoplasm in particular), because of insufficient contrast in the images obtained. For example, in the example of FIG. 1, the backscattered signal from collagen fibers has a high intensity due to the high refractive index of the lipid and protein structures. On the other hand, the backscattered signal coming from internal structures of the cells is of very weak intensity, approximately 400 times weaker than that of the collagen fibers. The difference in intensity of the signals backscattered by the different structures of the cell and the tissue explains why, despite the excellent spatial resolution of full-field OCT technology, this technology does not make it possible to acquire the low-intensity signals with sufficient dynamics.

[0006] Other microscopy techniques used in anatomopathology allow the visualization, by means of a microscope, of cells and cell structures on a tissue sample: these techniques consist in taking a tissue sample in the area to be analyzed, then in preparing a tissue slide observable under a conventional microscope. The preparation consists in embedding the tissue in paraffin, cutting very thin slices therein - of the order of 3 to 5 μm in thickness - and applying to these slices contrast agents or dyes making it possible to enhance the contrast of the cell structures and thus facilitate their visualization by the anatomopathologist. Typically, a combination of hematoxylin and eosin is applied to these slices to specifically stain support structures and cell nuclei.

[0007] Other stainings may also be applied to allow visualization of certain structures even more specifically. The resulting tissue slice is then placed on a microscope slide and observed at various magnifications by the pathologist. Macroscopic examination at low magnification as well as microscopic examination at high magnification make it possible to observe the tissue architecture as well as the cellular details present on the tissue sample. A set of reading criteria makes it possible to diagnose cancer, for example by examining the stromal reaction resulting from the tumor invasion or by examining the nucleus/cytoplasm ratio at the level of the individual cell. These microscopy techniques, however, have several disadvantages. In the first place, these techniques are destructive, since, unlike the full-field OCT technique, which makes it possible to obtain an image of a virtual slice of the tissue sample, these techniques require, as described, the preparation of a physical slice of the sample.
The tissue will no longer be usable for other types of analysis. Moreover, these techniques rely on a complex preparation of the sample, including its physical handling during multiple stages (paraffin embedding, cutting, staining) as well as the use of exogenous contrast agents such as dyes specific to cellular structures: the final quality of the sample thus prepared, and consequently the quality of the corresponding diagnosis, is therefore operator-dependent. Finally, these techniques are not suitable for intraoperative use because, taking into account the different steps needed to prepare the tissue, it usually takes several days before the tissue can be observed. There is therefore a need for an imaging technique for acquiring images of a sample, in particular of a biological tissue, which overcomes at least some of the limitations and/or disadvantages of the state of the art.

According to a first aspect, the present disclosure relates to a system for imaging, by full-field interference microscopy, a scattering volume sample, comprising:
- an interference device comprising an object arm intended to receive the sample and a reference arm on which a reflection surface is arranged, the interference device being adapted to produce, when the sample is placed on the object arm of the interference device, at each point of an imaging field, an interference between a reference wave obtained by reflection of incident light waves on an elementary surface of the reflection surface corresponding to said point of the imaging field and an object wave obtained by backscattering of incident light waves by a voxel of a slice of the sample at a given depth, said voxel corresponding to said point of the imaging field,
- an acquisition device adapted to acquire, at a fixed path difference between the object arm and the reference arm, a temporal succession of N two-dimensional interferometric signals resulting from the interferences produced at each point of the imaging field,
- a processing unit configured to calculate an image representative of temporal variations in intensity between said N two-dimensional interferometric signals.

The imaging field is limited in its dimensions, for example, by a useful detection surface of the acquisition device or by a field diaphragm of the imaging system. Thus, according to one example, the imaging field at the level of the sample slice (or coherence slice) is the zone conjugate with the useful surface of the acquisition device (or with the field diaphragm). A "point" of the imaging field is defined in its dimensions by the dimensions of an elementary detector of the acquisition device. An elementary surface of the reflection surface corresponding to a point of the imaging field represents, in the present description, an elementary surface defined on the reflection surface and whose position corresponds to the position of the point of the imaging field defined relative to a two-dimensional coordinate system associated with the acquisition device. A voxel corresponding to a point of the imaging field represents, in the present description, an elementary volume defined in the coherence slice and whose position in the coherence slice corresponds to the position of the point of the imaging field defined relative to a two-dimensional coordinate system associated with the acquisition device. A wave backscattered by a voxel is representative of the amplitude of the coherent sum of the waves backscattered by all the elementary scattering structures present in this voxel.
The imaging system thus described makes it possible to obtain an image of the sample representative of the temporal variations of intensity between the interferometric signals acquired at a fixed path difference, an image which, as the applicants have shown, gives access to information not perceptible in the images obtained with the full-field OCT techniques of the prior art. By fixed path difference, it is understood here that the optical path difference between the reference arm and the object arm is constant; in other words, there is no relative variation in the position of the sample with respect to the reflection surface. The advantages associated with this imaging technique in the field of cellular or intracellular imaging stem in particular from the observations made by the applicants as to the activity occurring at the cellular or intracellular level, and more specifically from the different movements at the origin of the temporal variations of intensity between the interferometric signals: flow or circulation of liquids, and also intracellular movements of different organelles such as the cytoskeleton, cell nuclei, mitochondria, lysosomes, etc.

[0008] According to one or more embodiments of the imaging system, the processing unit is configured to calculate the image by computing, for each pixel of given position in this image, a pixel value as a function of a value of a parameter representative of the temporal variations in intensity of the N two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device. The extraction of a parameter representative of the fluctuations, or temporal variations over a period of time, of the intensities of the interferometric signals acquired at a given point makes it possible to show, in the calculated image, tissue and cellular regions of the sample in which movements occur. According to one or more embodiments, this parameter is a parameter representative of the temporal dispersion of the intensities of the N two-dimensional interferometric signals considered. Such a parameter is for example the standard deviation of the statistical distribution of the intensities. An overall measurement representative of the temporal dispersion of the backscattered light intensities at a given location of the biological tissue is thus made. An image representation of the values obtained for this parameter makes it possible to highlight and visualize the tissue regions where movements occur. According to one or more embodiments of the imaging system, a pixel of the image has at least one component, defined relative to a color representation space, whose value is a function of the value of the chosen parameter. For example, in the case of a grayscale image, areas of the sample that are animated by a large movement, and for which the value of this parameter is therefore high, will appear in such images with a high gray level. Conversely, the areas for which no movement is detected, and which have a zero parameter value, will have a very low gray level. This image acquisition method is particularly interesting in the case of tissues with a large variety of structures, and therefore of scatterers, for which the dynamic range of the backscattered signal is very extensive.
According to one or more embodiments of the imaging system:
- the acquisition device is configured to further acquire P two-dimensional interferometric signals for the sample slice at different values of the path difference between the two arms of the interference device,
- the processing unit is configured to compute an image, called the combined image, from the intensities of the P two-dimensional interferometric signals and said temporal variations of intensity.

Compared with a tomographic image obtained by a full-field OCT imaging method of the prior art, the combined image constitutes a more complete and information-rich representation of the different structures present in the sample: it combines, on the one hand, the information that can be obtained by a full-field OCT imaging method according to the prior art, this information being representative of the intensities of the signals backscattered by the different voxels, and, on the other hand, the information representative of the temporal variations of intensity of the signals backscattered by the different voxels.

[0009] According to one embodiment of the imaging system, the processing unit is configured to calculate the combined image by calculating, for a pixel of given position in the combined image, a pixel value as a function of, on the one hand, the intensities of the P two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device and, on the other hand, the temporal variations in intensity of the N two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device. The various embodiments of the imaging system according to the present description can be combined with each other.

[0010] According to a second aspect, the present description relates to a method of imaging, by full-field interference microscopy, a scattering volume sample, comprising:
- the placing of the sample on an object arm of an interference device;
- the production, by means of the interference device, for each point of an imaging field, of an interference between a reference wave obtained by reflection of incident light waves on an elementary surface of a reflection surface of a reference arm of the interference device, said elementary surface corresponding to said point of the imaging field, and an object wave obtained by backscattering of the incident waves by a voxel of a slice of the sample at a given depth, said voxel corresponding to said point of the imaging field,
- the acquisition, at a fixed path difference between the object arm and the reference arm, of a temporal succession of N two-dimensional interferometric signals resulting from the interference produced at each point of the imaging field,
- the calculation of an image representative of temporal variations of intensity between the N two-dimensional interferometric signals.

According to one embodiment of the imaging method, the step of calculating the image comprises calculating, for each pixel of given position in the image, a pixel value as a function of a value of a parameter representative of the temporal variations in intensity of the N two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device.
According to a particular embodiment, this parameter is representative of the temporal dispersion of the intensities of the N two-dimensional interferometric signals acquired. According to one embodiment of the imaging method, a pixel of the image has at least one component, defined relative to a colorimetric representation space, whose value is a function of the value of the chosen parameter.

According to one embodiment, the imaging method further comprises:
- the acquisition of at least P two-dimensional interferometric signals for the sample slice at different values of the path difference between the two arms of the interference device,
- the calculation of an image, called the combined image, from the intensities of the P two-dimensional interferometric signals and said temporal variations of intensity.

According to one embodiment of the imaging method, the calculation of the combined image comprises, for each pixel of given position in the combined image, the calculation of a pixel value as a function of, on the one hand, the intensities of the P two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device and, on the other hand, the temporal variations in intensity of the N two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device.

[0011] The advantages stated for the imaging system are transferable to the imaging method according to the second aspect of the present disclosure. The various embodiments of the imaging method according to the second aspect of the present description can be combined with each other. Different features and embodiments of the various aspects of the present disclosure may also be combined with each other.

BRIEF DESCRIPTION OF THE FIGURES
Other advantages and characteristics of the imaging technique presented above will become apparent on reading the detailed description below, made with reference to the figures, in which:
- FIG. 1 (already described) is an example of an image, called an FFOCT image, of a biological tissue sample (rat liver) containing collagen fibers, produced using a full-field OCT imaging method according to the prior art; such a method will subsequently be named FFOCT imaging method;
- FIG. 2 is a block diagram of an embodiment of an imaging system according to the present description;
- FIG. 3 is a flowchart of an embodiment of an imaging method according to the present description; this method will subsequently be named DC-FFOCT imaging method;
- FIG. 4 is an example of an image, called a DC-FFOCT image, of the same sample as that shown in FIG. 1, at the same place, in the same coherence slice, obtained by using a DC-FFOCT imaging method according to the present description;
- FIGS. 5A and 5B show intensity variation curves of the signals acquired, in an exemplary DC-FFOCT imaging method, at 2 points of the sample whose image is shown in FIG. 4, the 2 points forming part of areas respectively showing absence of intracellular motion and presence of intracellular motion;
- FIGS. 6A and 6B show the statistical distributions of the signal intensities whose variation curves are represented respectively in FIGS. 5A and 5B;
- FIG. 7 is a flowchart of an embodiment of an imaging method for generating a combined image; such a method will subsequently be named CDC-FFOCT imaging method (for "Combined Dynamic Contrast FFOCT");
- FIG. 8 is an example of an image, called a CDC-FFOCT image, resulting from a CDC-FFOCT imaging method as described with reference to FIG. 7 and representing the same sample as that represented in FIGS. 1 and 4, at the same place and in the same coherence slice.
DETAILED DESCRIPTION
Imaging system

An embodiment of an imaging system 20 adapted to implement the imaging methods of a volume sample according to the present disclosure is shown schematically in FIG. 2. The imaging system 20 comprises an interference device 200, an acquisition device 208 and at least one processing unit 220.

[0012] The interference device 200 is adapted to produce optical interferences between, on the one hand, reference waves obtained by reflection, by each elementary surface of a reflection surface 205 of a reference arm of the interference device, of the light emitted by a light source 201 that is spatially incoherent and of short coherence length and, on the other hand, object waves obtained by backscattering of the light emitted by the same source by each voxel of a slice of a sample 206 at depth in the sample, the sample 206 being disposed on an object arm of the interference device, said voxel and said elementary surface corresponding to the same point of the imaging field. The light source 201 is an incoherent source, or a source of short coherence length (in practice, in a range of 1 to 20 micrometers), for example a halogen lamp or an LED.

[0013] According to one or more exemplary embodiments, the light source 201 may be part of the imaging system 20, as in the example of FIG. 2, or may be an element external to the imaging system, the imaging system being adapted to work with light waves emitted by the source. The acquisition device 208 enables the acquisition of at least one two-dimensional interferometric signal resulting from interference between the reference waves and the object waves. The processing unit 220 is configured to perform at least one step of processing at least one two-dimensional interferometric signal acquired by the acquisition device 208 and/or at least one image generation step, in accordance with at least one of the imaging methods according to the present description, in order to generate at least one image of the sample slice.

[0014] In one embodiment, the processing unit 220 is a computing device comprising a first memory CM1 (not shown) for storing digital images, a second memory CM2 (not shown) for storing program instructions, as well as a data processor able to execute the program instructions stored in this second memory CM2, in particular to control the execution of at least one step of processing at least one two-dimensional interferometric signal acquired by the acquisition device 208 and/or of at least one image calculation step according to at least one of the imaging methods according to the present description. The processing unit can also be implemented as an integrated circuit, comprising electronic components adapted to implement the function or functions described in this document for the processing unit. The processing unit 220 may also be implemented by one or more physically distinct devices. The acquisition device 208 is, for example, an image sensor of a camera, of the CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) type. This acquisition device is capable of acquiring images at a high rate, for example at a frequency of 100 Hz.
Depending on the dynamics of the sample studied, and more specifically on the dynamics of the movements within the sample, it will be possible to use cameras operating from a few Hz up to a few kHz. According to one embodiment, the interferometer 200 comprises a beam splitter element 202, for example a non-polarizing splitter cube, making it possible to form two arms. In one of the arms, which will be called the "reference arm", is the plane reflection surface 205, for example a mirror. The other arm, which will subsequently be called the "object arm", is intended to receive, in operation, the volume, scattering sample 206, a slice of which it is desired to image, at at least one depth, according to one of the methods of the present description. In the example of FIG. 2, the interferometer is of the Linnik type and comprises two identical microscope objectives 203, 204 arranged on each of the arms. The reflection surface 205 is thus at the focus of the objective 204 of the reference arm and the sample 206 is intended to be positioned at the focus of the objective 203 of the object arm. Other types of interferometers can be envisaged for carrying out the methods according to the present description, in particular interferometers of the Michelson, Mirau, Fizeau, etc. type.

[0015] At the output of the interferometer 200 is placed an optics 207, for example an achromatic doublet, the focal length of which is adapted to allow adequate sampling of the sample 206 by the acquisition device 208, and which makes it possible to conjugate the planes located at the foci of the two objectives in the same plane at the output of the interference device. The acquisition device 208 is placed in the latter plane in order to acquire the interference signals produced by the interference device. In order not to limit the resolution allowed by the microscope objectives 203 and 204, the choice of the focal length of the optics 207 will be made in accordance with the Shannon criterion. The focal length of the optics 207 is for example a few hundred millimeters, typically 300 mm.

[0016] Glass slides 209, 210 are, if necessary, provided on each of the arms to compensate for the dispersion. Since the light source 201 has a short coherence length, interference between the light reflected by the reflection surface 205 (reference wave) and the light backscattered by the sample 206 occurs only when the optical paths in both arms are equal, to within the coherence length. Thus, interference occurs between the reference wave and the light backscattered by each voxel of a slice located in a plane perpendicular to the optical axis of the object arm, at a given depth of the sample, called the coherence slice, a voxel being an elementary volume defined in the coherence slice. The light backscattered by each voxel is representative of the amplitude of the coherent sum of the waves backscattered by all the elementary scattering structures present in this voxel. The interferometric signals resulting from the optical interferences between the reference waves and the waves backscattered by the different voxels are acquired in parallel, at an instant t, by the acquisition device 208. This results in an interferometric image S corresponding to the interference state, at a given instant t, of the coherence slice.
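As an illustration of the Shannon sampling criterion mentioned in paragraph [0015] for the choice of the focal length of the optics 207, the short sketch below checks that a diffraction-limited spot imaged onto the camera covers at least two pixels. This is only a minimal sketch: apart from the 300 mm focal length quoted above, all numerical values are assumptions introduced for this example.

```python
# Illustrative check of the Shannon (Nyquist) criterion for the focal length of optics 207.
# Only the 300 mm tube-lens focal length comes from the text; every other value is assumed.

wavelength = 0.7e-6      # m, assumed centre wavelength of the source 201
na = 0.8                 # assumed numerical aperture of objectives 203 and 204
f_objective = 10e-3      # m, assumed focal length of the microscope objectives
f_tube = 300e-3          # m, focal length of optics 207 (value given in the text)
pixel_pitch = 5e-6       # m, assumed pixel pitch of the camera 208

magnification = f_tube / f_objective               # lateral magnification onto the sensor
spot_in_sample = wavelength / (2 * na)             # diffraction-limited resolution in the sample
spot_on_camera = magnification * spot_in_sample    # size of a resolved spot on the sensor

# Shannon/Nyquist: a resolved spot must cover at least two pixels of the acquisition device
print(f"magnification: {magnification:.0f}x, spot on camera: {spot_on_camera * 1e6:.1f} um")
print("criterion satisfied:", spot_on_camera >= 2 * pixel_pitch)
```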
An image element, or interferometric image pixel, located at a given position (x, y) defined relative to a two-dimensional coordinate system associated with the acquisition device 208, has a value S(x, y, t) corresponding to the intensity of the interferometric signal acquired at time t at the position (x, y), resulting from the interference between the wave backscattered by the voxel of corresponding position in the sample and the reference wave reflected by an elementary surface of corresponding position on the reflection surface 205 of the reference arm. More precisely, the light intensity S(x, y, t) of the interferometric signal acquired by the acquisition device 208 at the position (x, y) and at the instant t can be expressed in the form:

S(x, y, t) = (S0/4) · [ RA(x, y) + RB(x, y, t) + Rref + 2·√(RB(x, y, t)·Rref)·cos φ(x, y, t) ]   (1)

where:
- S0 is the light intensity of the incident wave at the input of the interferometer;
- RA(x, y) is an equivalent reflection coefficient, proportional to the intensity of the light backscattered by the sample without interfering, i.e. the light backscattered by sample structures located outside the coherence slice;
- RB(x, y, t) is an equivalent reflection coefficient, proportional to the intensity of the light backscattered by a voxel of the sample 206 in the coherence slice, at a given position (x, y) and at a given instant t;
- Rref is the reflection coefficient of the reference surface (reflection surface 205);
- φ(x, y, t) is the relative phase between the reference wave and the wave backscattered by the voxel of the sample at (x, y) and at time t.

The processing unit 220 is configured to generate an image of the sample 206 from at least one two-dimensional interferometric signal obtained by the acquisition device 208.

[0017] Various methods of using this imaging system and of generating images from the two-dimensional interferometric signals produced by this imaging system are described in more detail below.

DC-FFOCT imaging method
The main steps of an embodiment of an imaging method according to the present description, named DC-FFOCT (for "Dynamic Contrast FFOCT"), are described with reference to FIG. 3. This method applies in particular to intracellular imaging and can be implemented by the imaging system 20. In an initial step 300, the DC-FFOCT imaging method is initialized and a sample 206 is placed in the object arm of the interference device 200 at a position allowing a first sample slice to be analyzed. This first slice is the current slice for the first execution of steps 310 to 330 described below. In step 310, a temporal succession of N two-dimensional interferometric signals of the current slice of the sample 206 is acquired by the acquisition device 208 at a fixed path difference between the object arm and the reference arm. In particular, no variation in the path difference is produced in this case, unlike what is done for a full-field OCT imaging method according to the prior art. According to one embodiment, the path difference is kept fixed by maintaining at a fixed position both the reflection surface in the reference arm and the sample 206 in the object arm of the interference device 200. In the absence of variation of the path difference, the analysis of the interference signals obtained for a sample slice makes it possible to obtain two-dimensional interferometric signals clearly showing the movements internal to the sample 206 for the structures present in this slice of the sample.
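As a purely numerical illustration of equation (1) and of the fixed-path-difference acquisition of step 310, the sketch below simulates the intensity recorded at a single pixel when the coefficient RB and the relative phase φ fluctuate over time (as they do when scatterers move inside the corresponding voxel), all other terms being constant. Every numerical value is an assumption chosen only for this example.

```python
import numpy as np

# Minimal simulation of equation (1) at one pixel, at a fixed path difference:
# only RB and phi fluctuate (moving scatterers); S0, RA and Rref are constant.
# Every numerical value below is an assumption for illustration.

rng = np.random.default_rng(0)

S0 = 1.0          # incident intensity at the interferometer input (arbitrary units)
RA = 1e-3         # assumed incoherent background reflectivity
Rref = 0.02       # assumed reflectivity of the reference surface 205
RB_mean = 1e-6    # assumed mean reflectivity of the voxel in the coherence slice

n = 1000                                              # N images at 100 Hz, i.e. 10 s
RB = RB_mean * (1 + 0.2 * rng.standard_normal(n))     # "horizontal" movements -> RB fluctuates
phi = 0.5 + 0.05 * np.cumsum(rng.standard_normal(n))  # "vertical" movements -> phase drifts

S = (S0 / 4) * (RA + RB + Rref + 2 * np.sqrt(np.clip(RB, 0, None) * Rref) * np.cos(phi))
print("temporal mean and standard deviation of S:", S.mean(), S.std())
```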
The acquisition time of the sequence of two-dimensional interferometric signals and/or the acquisition frequency (and therefore the number of two-dimensional interferometric signals) can be chosen in a manner adapted to the characteristics of the cellular or intracellular movements to be highlighted. For example, the faster these movements are, the higher the acquisition frequency will be chosen, so as to sample significantly the different intensities of the interferometric signal at a given location. Conversely, for slower movements, a lower frequency and a longer acquisition time may be more suitable. The number N is typically chosen between 10² and 10⁴, for example around 1000, for an acquisition frequency of between 100 and 1000 images per second, typically 100 images per second, i.e. an acquisition over a period of between 0.1 and 100 seconds, typically about ten seconds. These parameters may vary according to the targeted applications: the nature of the cells, the different pathologies for which the images are to be generated, or the wavelength used. With cameras operating from a few Hz to a few kHz, a sampling adapted to the different applications is obtained.

[0018] The acquisition of the N two-dimensional interferometric signals (step 310) results in N interferometric images SNi(x, y), where SNi(x, y) is the intensity of the signal acquired at the position (x, y) by the acquisition device, i being an integer ranging from 1 to N and (x, y) representing the position relative to a two-dimensional coordinate system associated with the acquisition device 208. These interferometric images are stored in the first memory CM1, accessible by the processing unit 220, or are transmitted to the processing unit 220. In accordance with equation (1) above, each light intensity SNi(x, y) of the interferometric signal acquired at the position (x, y) at time t = ti is of the form:

SNi(x, y) = (S0/4) · [ RA(x, y) + RBi(x, y) + Rref + 2·√(RBi(x, y)·Rref)·cos φi(x, y) ]   (2)

where S0, RA(x, y) and Rref have been defined in equation (1) above, RBi(x, y) is the reflection coefficient proportional to the intensity of the light backscattered at time ti by a voxel of the sample 206 at a given position (x, y) in the coherence slice, and φi(x, y) is the relative phase between the reference wave and the wave backscattered by the voxel of the sample at the position (x, y) at time ti.

[0019] In practice, the applicants have shown that movements of the scatterers along an axis parallel to the optical axis of the object arm ("vertical" movements) result in variations of the relative phase φ(x, y) between the reference wave and the backscattered wave, while movements of the scatterers in a plane perpendicular to the optical axis of the object arm ("horizontal" movements) result in variations of the reflection coefficient RB(x, y) proportional to the backscattered light intensity. In practice, the intensity fluctuations between the light intensities SNi(x, y) result from both "horizontal" and "vertical" movements. The processing unit 220 of the imaging system is then used to calculate an image, called the dynamic contrast image or DC-FFOCT image, representative of the temporal variations in intensity between these N two-dimensional interferometric signals acquired by the acquisition device 208. The computation of this dynamic contrast image comprises steps 320 and 330 below.
In step 320, for each possible position (x, y) of a pixel in an interferometric image acquired by the acquisition device 208, a value of at least one parameter VN(x, y) is calculated, representative of the temporal variations in intensity of the two-dimensional interferometric signals SNi(x, y) acquired at the corresponding position (x, y) during step 310 for the current slice of the sample 206. The temporal variations of the intensities of the two-dimensional interferometric signals SNi(x, y) are due in particular to temporal variations of the coefficient RBi(x, y) and of the relative phase φi(x, y), for example caused by movements of scatterers inside a voxel, such as certain intracellular elements in the case of a cell, the other terms being substantially constant over the acquisition period of the N interferometric signals.

[0020] In one embodiment, the parameter VN(x, y) is representative of the characteristics of the statistical distribution of the intensities of the acquired two-dimensional interferometric signals, in particular of the temporal dispersion of these intensities. This parameter VN(x, y) is for example a function of the standard deviation of the intensities SNi(x, y), i varying from 1 to N in this succession of N interferometric images. For example:

VN(x, y) = √( (1/N) · Σ_{i=1..N} ( SNi(x, y) − MN(x, y) )² )   (3)

with

MN(x, y) = (1/N) · Σ_{i=1..N} SNi(x, y)   (4)

This parameter VN(x, y) can also be an average value of standard deviations calculated respectively on different subsets of the N interferometric images.

[0021] Other statistical or mathematical parameters making it possible to characterize the amplitude of the temporal variations of intensity of the pixels and/or the dispersion of these intensities can be used, such as the variance (square of the standard deviation), the interquartile range, the range (difference between the highest and the lowest value), etc. The parameter VN(x, y) can also be obtained by various spatio-temporal signal processing methods: autocorrelation coefficient of the time signals SNi(x, y) obtained for each pixel (x, y), or singular value decomposition, for example. In step 330, a dynamic contrast image IB, or DC-FFOCT image, is calculated, representative of the temporal intensity variations between the N two-dimensional interferometric signals of the current slice of the sample 206.

[0022] In one embodiment, each pixel IB(x, y) of the image IB, located at a given position (x, y), represents the value calculated for that given position for the parameter chosen in step 320. For example, a pixel of the image IB located at a given position (x, y), and/or at least one component of this pixel defined relative to a color representation space, has a value which is a function of the value calculated for the parameter considered for the corresponding position (x, y) from the intensities SNi(x, y), for i = 1 to N, of the N interferometric signals acquired in step 310. For example, when the color representation space used for the image IB is a grayscale representation, the value of the pixel IB(x, y) may be equal to, or a function of, the value VN(x, y) to within a scaling factor, so as, for example, to obtain a gray level encoded on a given number of bits s. According to another example, when the color representation space used for the image IB is the RGB (Red, Green, Blue) color representation space, at least one of the components R, G or B of the pixel IB(x, y) of position (x, y) in the image IB will be equal to, or a function of, VN(x, y) to within a scaling factor, so as, for example, to obtain a colorimetric component coded on a given number of bits. In another example, when the color representation space used for the image IB is the YUV (luminance/chrominance) color representation space, at least one of the components Y, U or V of the pixel IB(x, y) of position (x, y) in the image IB will be equal to, or a function of, VN(x, y) to within a scaling factor, so as, for example, to obtain a component coded on a given number of bits.
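A minimal sketch of the computation of steps 320 and 330 is given below, assuming the N interferometric images SNi(x, y) are available as a numerical array named `stack` of shape (N, height, width); the array name and the rescaling choices are assumptions introduced only for this example.

```python
import numpy as np

def dc_ffoct_image(stack: np.ndarray) -> np.ndarray:
    """Step 320: parameter VN(x, y) as the temporal standard deviation, equations (3)-(4)."""
    mn = stack.mean(axis=0)                              # MN(x, y), equation (4)
    return np.sqrt(((stack - mn) ** 2).mean(axis=0))     # VN(x, y), equation (3)

def dc_ffoct_image_blockwise(stack: np.ndarray, n_blocks: int = 10) -> np.ndarray:
    """Variant mentioned above: mean of standard deviations over subsets of the N images."""
    return np.mean([b.std(axis=0) for b in np.array_split(stack, n_blocks, axis=0)], axis=0)

def to_component(vn: np.ndarray, bits: int = 8) -> np.ndarray:
    """Step 330: map VN(x, y) to a gray level (or to one colorimetric component) on s bits."""
    scale = (2 ** bits - 1) / max(vn.max(), 1e-12)       # scaling factor, an arbitrary choice
    return np.round(scale * vn).astype(np.uint8)

# Example of use (hypothetical stack of N = 1000 images):
# ib = to_component(dc_ffoct_image(stack))
```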
The dynamic image IB thus produced can then be displayed on a display screen 230 connected to the image processing unit.

FIG. 4 shows an example of a dynamic image IB produced by the DC-FFOCT imaging method for the same coherence slice of the sample as that represented in the image of FIG. 1. The pixels of this image are coded on gray levels. The image corresponds to a sample area of size 800 μm by 800 μm extracted from a rat liver. In this image, fine structures can be distinguished, including individual cells, which were not present in FIG. 1. One distinguishes (see the dotted outlines) not only the liver cells C1, C2 but also their nucleus (darker circular zone N1 in the middle of the liver cell C1). Such details were not present, or not visible, in the image of FIG. 1, obtained by a prior art full-field OCT imaging technique for the same sample slice. The example of FIG. 4 illustrates in particular the interest of the method according to the present description for cellular and intracellular imaging, and notably demonstrates the substantial improvement in the contrast of cellular and intracellular structures.

At the microscopic scale, particularly at the cellular or intracellular level, different movements occur: fluid flow or circulation, but also intracellular movements of various organelles such as cell nuclei, mitochondria, lysosomes, etc. This is true not only for in-vivo tissues of humans, live animals or plants, but also for freshly collected samples from humans, live animals or plants. For example, the complete death (apoptosis) of a human cell from a sample occurs only 24 to 48 hours after the sample is taken from the living person. During this time, life continues and movements are detectable, especially inside the cells. Biological tissues furthermore exhibit optical backscattering properties such that about a few thousandths to a few millionths of the power of the incident light beam are backscattered by these tissues, because of the presence in these tissues of various structures, referred to in the remainder of the description as "scatterers", whose size is smaller than or close to the wavelength of the light beam. The movements of the fine structures present in the tissue, such as the intracellular structures, produce phase differences in the backscattered beam and, through interference, temporal variations in the light intensity of the signal acquired by the interference microscope. These temporal variations of intensity can be detected, for a given period of time, by acquisition at a given frequency of a succession of two-dimensional interferometric signals of the same sample slice.
By acquiring two-dimensional interferometric signals at a fixed path difference between the object arm and the reference arm, the applicants have shown that the interference states occurring successively in a slice of the sample can be sampled at the desired rate, and that an analysis of the temporal variations of intensity between these two-dimensional interferometric signals can be made in order to produce an image of the tissue regions of the sample in which the movements causing these temporal variations of intensity occur. By way of example, FIGS. 5A and 5B illustrate the temporal variations of the intensities of the two-dimensional interferometric signals acquired by the acquisition device 208 at two points, corresponding respectively to two positions (x1, y1) and (x2, y2) in a two-dimensional coordinate system associated with the acquisition device 208, and FIGS. 6A and 6B show the statistical distributions of the signal intensities whose variation curves are shown respectively in FIGS. 5A and 5B. For a first position (x1, y1) in a two-dimensional coordinate system associated with the acquisition device 208, it is observed in FIG. 5A that the intensities of the signals, recorded on 12 bits, vary overall between 3445 and 3450 (a variation of approximately 5 units) with a temporal distribution and an average that are substantially stable over time. At this position (x1, y1), which is for example a position located in the zone S2 shown in FIG. 4, centered on collagen fibers, the absence of movement is deduced. For a second position (x2, y2), which is for example a position located in the zone S3 shown in FIG. 4, inside a hepatic cell, it is observed in FIG. 5B that the intensities of the signals vary overall between 3231 and 3239 (a variation of about 9 units), with a greater temporal dispersion and an average that varies significantly over time. FIG. 5B is thus representative, by contrast, of a position (x2, y2) where there is intracellular movement. Comparison of FIGS. 6A and 6B shows a greater temporal dispersion in the distribution (FIG. 6B) of the intensities of the signal represented in FIG. 5B than in the distribution (FIG. 6A) of the intensities of the signal represented in FIG. 5A.

After calculating a DC-FFOCT image for a given coherence slice, it is determined in step 340 whether a DC-FFOCT image is to be acquired for another sample slice. If so, the position of the sample along the optical axis of the object arm is modified during step 340 and steps 310 to 330 are repeated for this second slice of the sample, which becomes the current slice. Otherwise, the process ends. A three-dimensional image of the sample 206 can thus be obtained by varying the position of the sample 206, and thus the position at depth in the sample 206 of the coherence slice for which there is equality of the optical paths in the two arms of the interference device, and by repeating the interferometric signal acquisition and image generation procedure for this coherence slice.

[0023] The steps 320 and 330 for generating the dynamic image IB can also be performed after step 340, provided that the N interferometric images acquired during step 310 are stored in memory for each sample slice. The image acquisition steps 310 for a sample slice may also be performed in parallel with the dynamic image generation steps 320 and 330 for a previous slice of the sample 206.
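As a simple illustration of the comparison made in FIGS. 5A/5B and 6A/6B above, the sketch below extracts the temporal trace of the interferometric signal at two pixel positions of the acquired stack and compares their statistical distributions; the positions and the array name are hypothetical.

```python
import numpy as np

def compare_positions(stack: np.ndarray, p1: tuple, p2: tuple) -> None:
    """Compare the temporal fluctuations of SNi(x, y) at two positions (x, y) of the stack."""
    for name, (x, y) in (("position 1", p1), ("position 2", p2)):
        trace = stack[:, y, x].astype(float)        # temporal trace, cf. FIGS. 5A/5B
        counts, _ = np.histogram(trace, bins=20)    # statistical distribution, cf. FIGS. 6A/6B
        print(f"{name}: mean={trace.mean():.1f}, std={trace.std():.2f}, "
              f"range={np.ptp(trace):.1f}, most populated bin={counts.max()}")

# Example of use with hypothetical coordinates (e.g. collagen zone S2 vs. hepatic cell zone S3):
# compare_positions(stack, (120, 80), (300, 210))
```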
CDC-FFOCT imaging method
The steps of one embodiment of the combined image generation method, also referred to as the CDC-FFOCT (for "Combined Dynamic Contrast FFOCT") imaging method, are described with reference to FIGS. 7 and 8. The CDC-FFOCT imaging method is implemented using the imaging system 20 and a modulation device 211, adapted to vary the relative phase between the reference wave and the backscattered wave. According to one example, the modulation device makes it possible to generate a relative displacement of the reflection surface with respect to the sample, for example by means of a piezoelectric plate moving the reflection surface; this results in a variation of the path difference between the object arm and the reference arm of the interference device 200, as will be described in more detail below. In an initial step 700, the CDC-FFOCT imaging method is initialized and a sample 206 is placed in the object arm of the interference device 200 at a position allowing a first sample slice to be analyzed. This first slice is the current slice for the first execution of steps 710 to 730 described below. Step 710 is identical to step 310 described for the DC-FFOCT imaging method. In particular, a temporal succession of N two-dimensional interferometric signals of the current slice of the sample 206 is acquired by the acquisition device 208 at a fixed path difference between the object arm and the reference arm. In step 720, according to an exemplary embodiment, a succession of P two-dimensional interferometric signals of the current slice of the sample 206 is acquired by the acquisition device 208 at different values of the path difference between the object arm and the reference arm of the interference device 200. The variation of the path difference is effected, for example, by moving the reference surface 205, parallel to the optical axis of the reference arm, by means of the piezoelectric plate 211, for example a displacement according to a sinusoidal function resulting in a modulation of the path difference. Synchronization with the image capture performed by the acquisition device 208 may then allow interferometric images to be recorded for predefined values of the phase of the two-dimensional interferometric signals. The acquisition of the interferometric signals in step 720 results in P interferometric images, denoted SPj, where SPj(x, y) is the intensity of the interferometric signal acquired at the position (x, y) and for a relative phase φ = φj between the reference wave and the backscattered wave, j being an integer ranging from 1 to P, P ≥ 2, and (x, y) representing the position relative to a two-dimensional coordinate system associated with the acquisition device 208. These interferometric images SPj(x, y) are stored in a memory CM1 accessible by the processing unit 220 or are transmitted to the processing unit 220.
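The relation between the displacement of the reference surface 205 produced by the piezoelectric plate 211 and the resulting relative phase can be illustrated by the following sketch, assuming reflection at normal incidence: a mirror displacement d changes the optical path difference by 2d, hence the phase by 2π·2d/λ. The wavelength value is an assumption.

```python
import numpy as np

wavelength = 0.7e-6   # m, assumed centre wavelength of the source 201

def displacement_for_phase_shift(delta_phi: float, wavelength: float) -> float:
    """Mirror displacement (m) giving a relative phase shift delta_phi (rad), normal incidence."""
    return delta_phi * wavelength / (4 * np.pi)

# Example: a phase shift of pi between two acquisitions corresponds to a displacement of
# lambda/4, i.e. about 0.18 um for the assumed wavelength.
print(displacement_for_phase_shift(np.pi, wavelength) * 1e6, "um")
```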
[0024] The light intensity SPj(x, y) of the interference signals acquired by the acquisition device 208 for the position (x, y) and for a relative phase φ = φj can be written in the form:

SPj(x, y) = (S0/4) · [ RA(x, y) + RB(x, y) + Rref + 2·√(RB(x, y)·Rref)·cos φj(x, y) ]   (5)

where S0, RA(x, y) and Rref have been defined in equation (1) above, RB(x, y) is an equivalent reflection coefficient proportional to the light intensity backscattered by a voxel of the sample 206 at a given position (x, y) in the coherence slice, and φj(x, y) is the relative phase between the reference wave and the wave backscattered by the voxel of the sample at the position (x, y). Each pixel value of the resulting full-field OCT image, corresponding to the coefficient RB(x, y), is obtained by a combination of the intensity values SPj(x, y) according to known methods.

[0025] In the case of a variation of the phase resulting from a temporal modulation of the relative displacement of the reflection surface with respect to the sample, for example, the phase φj(x, y) corresponds to the relative phase between the reference wave and the wave backscattered by the voxel of the sample at the position (x, y) and at a time t = tj. However, the applicants have shown that the temporal fluctuations of the coefficient RB(x, y) are very small compared with the sought coefficient RB(x, y) itself (typically by a factor of 1000). It is therefore reasonable to approximate RB(x, y) as constant as a function of time. In one embodiment, P = 2 interferometric images are acquired for two values of the path difference corresponding to a phase shift of π between the relative phases φ(x, y). Thus, a first interferometric image is recorded for a first value of the relative phase φ1(x, y) and a second interferometric image for a second value of the relative phase φ2(x, y) = φ1(x, y) + π. Each pixel value of the resulting full-field OCT image, corresponding to the coefficient RB(x, y), is obtained by calculating the absolute value of the difference between the two intensity values SP1(x, y) and SP2(x, y), respectively obtained for the two relative phase values φ1(x, y) and φ2(x, y). This method of acquiring P two-dimensional interferometric signals corresponds to what is used in an FFOCT imaging method according to the prior art. Any other method of calculating RB(x, y) according to a full-field FFOCT imaging method of the prior art can be used. In particular, it is possible to proceed, in a known manner, to a modulation of the path difference and to a synchronized detection by the detection device, in order to acquire P interferometric signals at P values of the path difference to determine the FFOCT image. It is also possible to integrate the interferometric signal over fractions of the modulation period, for example over 4 such fractions, to obtain the amplitude and the phase of the interferometric signal. It is also possible to acquire simultaneously (at the same instant) several interferometric images corresponding to different relative phase values, for example two relative phase values separated by π, for example by spatial separation of 2 beams phase-shifted by π.
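A minimal sketch of the computation of the tomographic (FFOCT) pixel values from the P phase-shifted images SPj is given below; the two-phase case follows the description above, and the four-phase variant is one common prior-art FFOCT demodulation, given here only as an example. The array `sp`, of shape (P, height, width), is a hypothetical name.

```python
import numpy as np

def ffoct_two_phase(sp: np.ndarray) -> np.ndarray:
    """P = 2, relative phases separated by pi: RB(x, y) is proportional to |SP1 - SP2|."""
    return np.abs(sp[0].astype(float) - sp[1].astype(float))

def ffoct_four_phase(sp: np.ndarray) -> np.ndarray:
    """P = 4, phases 0, pi/2, pi, 3*pi/2: amplitude ~ sqrt((S1 - S3)^2 + (S2 - S4)^2)."""
    s1, s2, s3, s4 = (sp[j].astype(float) for j in range(4))
    return np.sqrt((s1 - s3) ** 2 + (s2 - s4) ** 2)

# Example of use (hypothetical): ia = ffoct_two_phase(sp)
```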
In step 730, the processing unit 220 of the imaging system is then used to calculate an image, called the combined image or CDC-FFOCT image, also representative of the temporal variations in intensity between the N two-dimensional interferometric signals acquired by the acquisition device 208 in step 710. The combined image IC is calculated from, on the one hand, the intensities of the P two-dimensional interferometric signals acquired by the acquisition device 208 in step 720 and, on the other hand, the temporal variations in intensity between the N two-dimensional interferometric signals acquired by the acquisition device 208 during step 710.

[0026] In a first embodiment, a first intermediate image IB, or dynamic contrast image, is calculated from the temporal intensity variations between the N two-dimensional interferometric signals acquired by the acquisition device 208 in step 710, and a second intermediate image IA, called a tomographic image, is calculated from the intensities of the P two-dimensional interferometric signals acquired by the acquisition device 208 in step 720.

[0027] The combined image IC is then calculated by pixel-to-pixel combination of these two intermediate images IA and IB. By pixel-to-pixel combination, it is meant here that a pixel IA(x, y) of position (x, y) in the image IA is combined with the pixel IB(x, y) of the same position (x, y) in the image IB. In a second embodiment, no intermediate image is generated and the combined image IC is calculated by directly computing the pixels of the combined image from the intensities of the acquired interferometric signals. In particular, for a pixel of given position (x, y) in the combined image IC, a pixel value IC(x, y) is calculated as a function of, on the one hand, the intensities of the P two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device and, on the other hand, the temporal variations in intensity of the N two-dimensional interferometric signals acquired at a point of corresponding position in a two-dimensional coordinate system associated with the acquisition device. In the first as in the second embodiment, a pixel value IC(x, y) of given position (x, y) in the combined image is computed by combining two pixel values IA(x, y) and IB(x, y), where:
- the value IA(x, y) is calculated from the intensities of the P two-dimensional interferometric signals acquired at different values of the path difference for the position (x, y) and represents the coefficient RB(x, y) proportional to the intensity of the wave backscattered by a given voxel of the coherence slice of the sample, and
- the value IB(x, y) is calculated from the intensities of the N two-dimensional interferometric signals acquired at a fixed path difference and represents the temporal variations of the intensities determined between the N two-dimensional interferometric signals acquired at the point of corresponding position in a coordinate system associated with the acquisition device.

The value IB(x, y) is calculated according to the method described for step 330, for example. In particular, the value IB(x, y) can be calculated as a function of the value of at least one parameter VN(x, y) representative of the temporal variations of the intensities SNi(x, y) between the N interferometric signals.
It is assumed here, for the combination of pixel values, that the signal acquisitions performed at steps 710 and 720 have been carried out so that the signal SNi(x, y) acquired for the position (x, y) at step 710 comes from the same voxel of the sample 206 as the signal SPj(x, y) acquired for the position (x, y) during step 720: this will be the case if neither the sample 206 nor the acquisition device 208 is moved during the execution of steps 710 and 720, and if the acquisition device 208 acquires images of the same resolution in steps 710 and 720, using the same two-dimensional coordinate system.

[0028] The combination of the pixel values IA(x, y) and IB(x, y) is performed so that the information contained respectively in each of the images IA and IB for a given voxel of the sample 206 is present in the resulting combined image IC. This combination can be performed so that a user viewing the resulting image IC is able to discriminate the structures highlighted in the image IA (tomographic image) from the structural fluctuations highlighted in the image IB (dynamic contrast image). In particular, the value IC(x, y) of a pixel of the combined image located at a given position (x, y) in the image IC is a function of the value IA(x, y) of the pixel located at this same position (x, y) in the tomographic image IA and a function of the value calculated for this position (x, y) for the parameter representative of the temporal variations of intensity. The pixel IC(x, y) can have a component defined relative to a color representation space. Thus a user will be able to visualize, for a given location of the sample 206, an image containing both the information from a known full-field incoherent-light interference microscopy imaging method and the information on the movements of the fine structures of the sample 206 demonstrated by the DC-FFOCT imaging method according to the present disclosure. Different methods of combining these values IA(x, y) and IB(x, y) are possible and the methods described below are given as non-limiting examples.

[0029] A first combination method consists in assigning, for each position (x, y) in an area of the combined image, the pixel value IA(x, y) to a first colorimetric component of a pixel IC(x, y) of the image IC and the pixel value IB(x, y) to a second colorimetric component of this pixel IC(x, y). For example, using a representation according to the RGB color representation space, the R component of the pixel IC(x, y) of position (x, y) in the image IC will be equal to the pixel value IA(x, y) and the B (Blue) component of the pixel IC(x, y) of position (x, y) in the image IC will be equal to the pixel value IB(x, y), or vice versa. A second combination method consists in defining two correspondence tables TA, TB (called look-up tables or LUTs in English terminology), a first correspondence table TA for the pixel values IA(x, y) and a second correspondence table TB for the pixel values IB(x, y). Each correspondence table TA, respectively TB, is used to calculate, for each possible position (x, y) in an area of the combined image, a pixel value TA(x, y), respectively TB(x, y), resulting from the transformation of the pixel value IA(x, y), respectively IB(x, y), by the correspondence table TA, respectively TB. The value of a pixel IC(x, y) at a position (x, y) in the image IC is then obtained by a combination function of the pixel value TA(x, y) and the pixel value TB(x, y). This combination function is for example a "logical OR" of the pixel values TA(x, y) and TB(x, y), or a function adding these two pixel values, or calculating their mean value, or a sum weighted by weighting coefficients.
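A minimal sketch of the two combination methods described above is given below, assuming the tomographic image IA and the dynamic contrast image IB have already been rescaled to 8-bit values; the particular look-up tables and the weighting are arbitrary choices made only for this example.

```python
import numpy as np

def combine_rgb(ia: np.ndarray, ib: np.ndarray) -> np.ndarray:
    """First method: IA assigned to one colorimetric component, IB to another (here R and B)."""
    ic = np.zeros(ia.shape + (3,), dtype=np.uint8)
    ic[..., 0] = ia        # R component <- tomographic image IA
    ic[..., 2] = ib        # B component <- dynamic contrast image IB
    return ic

def combine_lut_weighted(ia: np.ndarray, ib: np.ndarray, w: float = 0.5) -> np.ndarray:
    """Second method: apply look-up tables TA, TB then combine by a weighted sum (gray levels)."""
    ta_lut = np.arange(256, dtype=float)                        # TA: identity table (arbitrary)
    tb_lut = np.sqrt(np.arange(256, dtype=float) / 255) * 255   # TB: boosts low values (arbitrary)
    ta, tb = ta_lut[ia], tb_lut[ib]                             # TA(x, y) and TB(x, y)
    return np.clip(w * ta + (1 - w) * tb, 0, 255).astype(np.uint8)

# Example of use (hypothetical 8-bit images ia and ib of the same size):
# ic = combine_rgb(ia, ib)   or   ic = combine_lut_weighted(ia, ib, w=0.6)
```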
[0030] The resulting image IC can be a color image or a gray-level image, depending on the combination method selected. In step 730, the combined image IC thus generated can then be displayed on a display screen 230 connected to the image processing unit. FIG. 8 shows an example of an image IC resulting from the combination of the image IA shown in FIG. 1 with the image IB shown in FIG. 4. The method used for the combination is, for example, the second method described above. Each of these images represents the same 800 μm by 800 μm area of the same sample slice (biological tissue of rat liver) as the images IA and IB. By comparing the image IA of FIG. 1 with the image IC of FIG. 8, it is clear that the image IC is much richer in fine structures than the image IA but nevertheless contains the information of the image IA of FIG. 1. These fine structures of the image IC actually correspond to fluctuations detected in the slice of the sample concerned: the fine structures of the image of FIG. 8 are those originating from the image IB (FIG. 4) which has been combined with the image IA (FIG. 1). By way of example, in the image IA of FIG. 1, a large number of collagen fibers, a zone S2 of collagen fiber concentration and a large capillary in the zone S1 can be distinguished, but no individual cells can be distinguished. Conversely, in the image IB of FIG. 4, individual cells are distinguished in the zones C1, C2 or S3, but not the collagen fibers of the zone S2 or the large capillary of the zone S1. In the combined image IC, one distinguishes both the collagen fibers (zone S2), the large capillary (zone S1) and the individual cells (zones C1, C2 or S3). Thanks to the combined image IC, a user thus has a single image containing a great deal of visual information on the structures present in the sample slice, as well as on the fluctuations, representative of the cellular or intracellular movements and activities, of this same sample slice.

Once a combined CDC-FFOCT image has been calculated for a given coherence slice of the sample, it is determined in step 740 whether a CDC-FFOCT combined image is to be generated for another sample slice. If so, the position of the sample along the optical axis of the object arm is changed in step 740 and steps 710 to 730 are repeated for that second slice of the sample, which becomes the current slice. Otherwise, the process ends. A three-dimensional image of the sample 206 can thus be obtained by varying the position of the sample 206, and thus the position at depth in the sample 206 of the coherence slice for which there is equality of the optical paths in the two arms of the interference device, and by repeating the interferometric signal acquisition and image generation procedure for this coherence slice. The DC-FFOCT and CDC-FFOCT imaging methods described above apply in particular to the acquisition of images of cells or of intracellular images, and may be particularly useful for the diagnosis of cancer, where the examination of cellular details is necessary. In particular, the dynamic origin of the contrast generated by the DC-FFOCT and CDC-FFOCT imaging methods makes it possible, in addition to the visualization of the cellular details, to visualize the cell activity and to distinguish the metabolic state of a cell (overactivity, cell death, necrosis).
They can also be applied, in general, to all situations in which the different voxels of the coherence slice of the sample to be analyzed present movements, activities, functions, mechanisms or physico-chemical properties causing a temporal variation of the light intensity reflected by these voxels. They are also applicable to any sample, whether organic or not.
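To close the description, a minimal sketch of the depth-scanning procedure outlined above (steps 710 to 730 repeated for successive coherence slices) is given below, reusing the helper functions from the earlier sketches. The callables passed as arguments, standing in for the translation stage and the acquisition device, are hypothetical placeholders, since the hardware interfaces are not specified here.

```python
import numpy as np

def acquire_cdc_ffoct_volume(move_sample_to, acquire_n_frames, acquire_p_frames, depths):
    """Build a 3-D CDC-FFOCT image by scanning the coherence slice in depth.

    move_sample_to, acquire_n_frames and acquire_p_frames are placeholder
    callables for the translation stage and the acquisition device.
    """
    volume = []
    for z in depths:
        move_sample_to(z)                      # select the coherence slice (step 740)
        frames_n = acquire_n_frames()          # N frames at fixed path difference (step 710)
        frames_p = acquire_p_frames()          # P frames at stepped path difference (step 720)
        ib = dynamic_contrast_image(frames_n)  # dynamic contrast image IB
        ia = tomographic_image(frames_p)       # tomographic image IA
        volume.append(combine_rgb(ia, ib))     # combined image IC (step 730)
    return np.stack(volume)
```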
Claims (12)

[0001] 1. An imaging system (20) for full-field interference microscopy of a scattering volume sample (206), comprising:
- an interference device (200) comprising an object arm for receiving the sample and a reference arm on which a reflection surface (205) is arranged, the interference device being adapted to produce, when the sample is disposed on the object arm of the interference device, at each point of an imaging field, an interference between a reference wave obtained by reflection of incident light waves on an elementary surface of the reflection surface (205) corresponding to said point of the imaging field and an object wave obtained by backscattering of incident light waves by a voxel of a slice of the sample at a given depth, said voxel corresponding to said point of the imaging field,
- an acquisition device (208) adapted to acquire, at a fixed path difference between the object arm and the reference arm, a temporal succession of N two-dimensional interferometric signals resulting from the interference produced at each point of the imaging field,
- a processing unit (220) configured to calculate an image (IB, IC) representative of temporal variations of intensity between said N two-dimensional interferometric signals.

[0002] 2. An imaging system according to claim 1, wherein the processing unit (220) is configured to calculate said image by calculating, for each pixel of given position in said image, a pixel value as a function of a value of a parameter representative of the temporal variations of intensity of the N two-dimensional interferometric signals acquired at a corresponding position point in a two-dimensional coordinate system associated with the acquisition device.

[0003] 3. An imaging system according to claim 2, wherein said parameter is representative of the temporal dispersion of the intensities of the N two-dimensional interferometric signals.

[0004] 4. An imaging system according to claim 2 or 3, wherein said pixel has at least one component, defined relative to a color representation space, whose value is a function of the value of said parameter.

[0005] 5. An imaging system according to any one of claims 1 to 4, wherein the acquisition device is configured to further acquire P two-dimensional interferometric signals for the sample slice at different values of the path difference between the two arms of the interference device, and the processing unit is configured to calculate an image (IC), called the combined image, from the intensities of the P two-dimensional interferometric signals and from said temporal variations of intensity.

[0006] 6. An imaging system according to claim 5, wherein the processing unit (220) is configured to calculate said combined image by calculating, for a pixel of given position in said combined image, a pixel value as a function of, on the one hand, the intensities of the P two-dimensional interferometric signals acquired at a corresponding position point in a two-dimensional coordinate system associated with the acquisition device and, on the other hand, the temporal variations of intensity of the N two-dimensional interferometric signals acquired at a corresponding position point in a two-dimensional coordinate system associated with the acquisition device.
[0007] 7. A method of imaging by full-field interference microscopy of a scattering volume sample (206), comprising:
- placing the sample on an object arm of an interference device (200),
- producing, by means of the interference device (200), for each point of an imaging field, an interference between a reference wave obtained by reflection of incident light waves on an elementary surface of a reflection surface (205) of a reference arm of the interference device, said elementary surface corresponding to said point of the imaging field, and an object wave obtained by backscattering of the incident waves by a voxel of a slice of the sample at a given depth, said voxel corresponding to said point of the imaging field,
- acquiring, at a fixed path difference between the object arm and the reference arm, a temporal succession of N two-dimensional interferometric signals resulting from the interference produced at each point of the imaging field,
- calculating an image (IB, IC) representative of temporal variations of intensity between the N two-dimensional interferometric signals.

[0008] 8. An imaging method according to claim 7, wherein the step of calculating the image comprises calculating, for each pixel of given position in said image, a pixel value as a function of a value of a parameter representative of the temporal variations of intensity of the N two-dimensional interferometric signals acquired at a corresponding position point in a two-dimensional coordinate system associated with the acquisition device.

[0009] 9. An imaging method according to claim 8, wherein said parameter is representative of the temporal dispersion of said intensities of the N two-dimensional interferometric signals.

[0010] 10. An imaging method according to claim 8 or 9, wherein said pixel has at least one component, defined relative to a color representation space, whose value is a function of the value of said parameter.

[0011] 11. An imaging method according to any one of claims 7 to 10, further comprising:
- acquiring at least P two-dimensional interferometric signals for the sample slice at different values of the path difference between the two arms of the interference device,
- calculating an image (IC), called the combined image, from the intensities of the P two-dimensional interferometric signals and from said temporal variations of intensity.

[0012] 12. An imaging method according to claim 11, wherein the calculation of the combined image comprises, for each pixel of given position in the combined image, the calculation of a pixel value as a function of, on the one hand, the intensities of the P two-dimensional interferometric signals acquired at a corresponding position point in a two-dimensional coordinate system associated with the acquisition device and, on the other hand, the temporal variations of intensity of the N two-dimensional interferometric signals acquired at a corresponding position point in a two-dimensional coordinate system associated with the acquisition device.
Patent family (publication number | publication date):
CA2982211A1 | 2016-10-13
US20200233198A1 | 2020-07-23
US10627613B2 | 2020-04-21
JP6948311B2 | 2021-10-13
CN107743582A | 2018-02-27
WO2016162521A1 | 2016-10-13
US20180120550A1 | 2018-05-03
EP3281053A1 | 2018-02-14
FR3034858B1 | 2017-05-26
JP2018517149A | 2018-06-28