ALERT ASSISTANCE FOR ULTRASOUND IMAGING IN DIAGNOSTIC MODE
Patent abstract:
An alert assist method for an ultrasound scanning device comprises: scanning a patient with an ultrasound transducer of the ultrasound scanning device in a diagnostic mode in which the ultrasound transducer moves relative to the patient's body; applying computer-aided detection, by the ultrasound scanner, to each of a sequence of frames acquired by the scan; identifying an object in a first one of the frames with the computer-aided detection of the ultrasound scanner; measuring, in response to the identification of the object in the first frame, a mechanical property of the object; and generating an image with an alert identifying the first frame and the measured mechanical property.

Publication number: FR3048870A1
Application number: FR1752177
Filing date: 2017-03-17
Publication date: 2017-09-22
Inventors: Andrzej Milkowski; Lei Sui
Applicant: Siemens Medical Solutions USA Inc
Primary IPC class:
Patent description:
BACKGROUND

The present embodiments relate to ultrasound imaging. In ultrasound imaging, users "examine" organs for suspicious areas. An examination is typically performed by moving the transducer to scan different planes through an organ. The goal is to identify any suspicious objects by viewing the sequence of images created during the examination. Occasionally, the user moves the transducer or reviews the images too quickly, increasing the probability of missing an object. Image persistence and/or the associated motion blur may contribute to the user missing a suspicious area. Users may reduce the speed of the examination, or repeatedly scan a region about which they are suspicious, in order to increase sensitivity, but this requires time that may not be available. Either a suspicious area is not identified (lowering sensitivity), or the scanning time is increased to confirm the imaging results. For automated examinations, such as by a large-volume scanner, the system scans the patient once to generate a sequence of images. The search for suspicious objects may be done after the examination of the patient is finished. As a result, the patient may have to return for another examination for a more detailed scan of any suspicious areas. This "recall" approach can be inefficient and expensive.

BRIEF SUMMARY

By way of introduction, the preferred embodiments described below include methods, instructions, and systems for alert assistance for an ultrasound scanner. Computer-aided detection is applied while the patient is being scanned. The user may be notified of any detected objects, so that the user gathers more information where appropriate. An automated system may be configured to go back and scan any detected objects. Based on the detection, information is gathered as part of the workflow for the given patient examination. 
A mechanical property of the object is derived from the additional information, providing further information that can be used to avoid a recall visit and/or to increase sensitivity in diagnostic mode scans.

In a first aspect, a method is provided for alert assistance for an ultrasound scanner. A patient is scanned with an ultrasound transducer of the ultrasound scanner in a diagnostic mode in which the ultrasound transducer moves relative to the patient. Computer-aided detection is applied by the ultrasound scanner to each of a sequence of frames acquired by the scan. The computer-aided detection of the ultrasound scanner identifies an object in a first one of the frames. In response to identifying the object in the first frame, a mechanical property of the object is measured. An image is generated with an alert identifying the first frame and the measured mechanical property.

According to variants, the method may comprise one or more of the following steps or features:
- the scanning comprises scanning with the movement of the ultrasound transducer controlled by a motor;
- the scanning comprises scanning with the movement of the ultrasound transducer performed manually by a user;
- the scanning comprises robotic scanning, and the measurement comprises robotic measurement;
- the application comprises applying the computer-aided detection as a machine-learned detector of the object, operable to process the frames of the sequence while the scanning is performed;
- the identification comprises identifying a tumor in the first frame and identifying no tumor in a plurality of other frames of the sequence;
- the scanning comprises scanning in a predetermined pattern for the sequence, and the measuring comprises controlling a motor connected with the ultrasound transducer to return the ultrasound transducer to a position for the first frame when the predetermined pattern is completed, and scanning for the mechanical property with the ultrasound scanning device;
- the measurement comprises deriving the mechanical property, at least in part, from the first frame;
- the measurement comprises measuring a strain, a strain rate, a shear velocity, an elasticity, or a Young's modulus;
- the measurement comprises deriving a characteristic of the object other than a response to energy from the imaging;
- the generation comprises generating the image with the alert being a flag of a location of the ultrasound transducer for scanning the object;
- the identification of objects in other frames of the sequence, and the storing of the first frame and the other frames;
- the generation comprises generating the image from the first frame with the alert comprising an indication that the object is detected in the image.

In a second aspect, a non-transitory computer-readable storage medium has stored therein data representing instructions executable by a programmed processor for alert assistance in ultrasound imaging. The storage medium includes instructions for: generating ultrasound images of a patient with an ultrasound transducer while the ultrasound transducer is moving along the patient; applying target detection to the ultrasound images during acquisition of the ultrasound images; recording an ultrasound transducer location in response to detection of the target in one of the images; notifying of the detection; acquiring data representing the target in response to the detection and using the location; deriving a value of a characteristic of the target from the data; and presenting the value. 
According to variants, the storage medium may comprise one or more of the following features:
- the application of the detection comprises applying a computer-aided detection specific to a type of examination for which the scanning is performed;
- the presentation of the value comprises generating a display of the image with the target highlighted and including the value of the characteristic;
- the acquisition of the data comprises using the image, rescanning the patient at the location, or both, and the derivation of the value comprises deriving a mechanical property as the characteristic;
- the recording of the location comprises recording an ultrasound transducer position electronically or physically;
- the notification comprises indicating the location to a user.

In a third aspect, a system is provided for alert assistance in ultrasound imaging. A robot connects with a transducer and is configured to move the transducer in a predetermined scanning pattern. A transmit beamformer and a receive beamformer are configured to scan a patient with ultrasound, using the transducer, while the robot moves the transducer. A processor is configured to apply computer-aided detection to results of the scan, to cause the robot to return the transducer to a location of a detection by the computer-aided detection after completion of the predetermined scan pattern, and to derive a mechanical property of tissue based on information acquired with the transducer returned to the location. A display is operable to display the mechanical property of the tissue with a flag for the results for the location.

Other aspects and advantages of the invention are discussed below in conjunction with the preferred embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

The components and figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. 
Moreover, in the figures, like reference numerals designate corresponding parts throughout the different views. Figure 1 is a flow chart of one embodiment of a method for alert assistance for an ultrasound scanner; Figure 2 illustrates an example of a diagnostic mode with an object identified in an image for measuring a mechanical property; and Figure 3 is one embodiment of a system for alert assistance in ultrasound imaging.

DETAILED DESCRIPTION OF THE DRAWINGS AND PRESENTLY PREFERRED EMBODIMENTS

The diagnostic workflow is improved in medical ultrasound imaging. During the diagnostic mode, the sonographer or a robot moves the transducer to examine an organ and/or locate a region of interest. In the diagnostic mode, the movement of the transducer allows scanning of different regions of the patient, to examine an organ or a range of regions of the patient for any suspicious objects. For example, a transducer is moved along a patient's breasts to examine the breasts for any tumors or cysts. The patient may be examined to find a region of interest. The transducer is moved to find an organ or location of interest. Part of the patient is examined to find the location of interest. An alert is provided in the diagnostic mode to help the user find objects in the patient. The alert is a notification to the user of the detection of the object, a flag on or a recording of the data or image representing the object, an indication of the location of the transducer for scanning the object, a highlighting on an image during review, or another alert in real time or post-procedure. The user is assisted by the alert, such as the alert helping to avoid missing the object and/or indicating where to scan or measure. In a robotic scanning embodiment, image data is supplemented with acoustic and/or mechanical properties in real time. A robot scans according to a predefined pattern. An algorithm runs in the background to recognize features, such as dense areas in a breast examination. 
For all recognized features, the imaging pose (e.g., transducer position and angle) is recorded and the user is notified. Alerts, such as labels for images with suspicious objects and/or saving of the suspicious images in addition to the conventional CINE loop, are provided to the user. Acoustic and/or mechanical properties of the target are then examined by returning to the target area, automatically or manually, after scanning with the predefined pattern. Alternatively, the property is derived from the saved data of the diagnostic scan. The examination for acoustic and/or mechanical properties may be performed with a different device (i.e., other than the ultrasound scanner) operated by the robot or by a person, or performed with the same ultrasound system. Suspicious objects labeled with the associated mechanical properties are highlighted in one or more images and/or in a volume rendering.

Figure 1 shows a method of alert assistance for an ultrasound scanner. In the diagnostic mode of operation, where the transducer moves to examine the patient, computer-aided detection indicates locations at which a mechanical property should be measured. This alerts the user to a suspicious region and allows collection of the needed data within the workflow, limiting the likelihood that a patient will leave and return for another examination and increasing sensitivity. The method improves medical imaging and scanning to gather information usable by a physician for a medical diagnosis.

The method is implemented by the system of Figure 3 or a different system. For example, the ultrasound imaging system generates ultrasound images using beamformers and a transducer. A processor, controller, or image processor of the ultrasound imaging system uses computer-aided detection to identify an object, records the location of the transducer and/or object, and notifies of the detection. 
The ultrasound imaging system or another device acquires data for the object, derives a mechanical property of the object, and presents the value as an alert to assist the user. Beamformers, memories, detectors, and/or other devices may be used to acquire the data, perform one or more of the actions, and/or output the data. The processor may control the devices to perform the method of Figure 1. Additional, different, or fewer actions may be provided. For example, the method is implemented without recording the location in action 32 or the notification in action 34. As another example, the method is implemented without presenting the value in action 40. Actions 34 and 40 may be the same action, such that the presentation of the value of action 40 is the notification of detection of action 34. The actions are performed in the order described or shown (e.g., top to bottom), but may be performed in other orders. Actions 32 and/or 34 may occur before, after, or simultaneously with any of actions 36, 38, and/or 40.

In action 26, the ultrasound scanner generates ultrasound images of the patient. The images are generated by scanning the patient. Alternatively, the images are generated by loading frames of data from a memory, the frames having been generated from a previous scan of the patient and stored in the memory. Any type of ultrasound image may be generated, such as B-mode, flow mode (e.g., velocity or Doppler power), contrast agent, harmonic, pulsed wave Doppler (i.e., spectral Doppler), M-mode, tissue Doppler (i.e., tissue motion), or another ultrasound imaging mode representing acoustic interaction with the patient. The different modes detect different types of information, such as echo intensity (e.g., B-mode and M-mode) or velocity (e.g., flow mode or tissue Doppler). To acquire the images, the ultrasound transducer is moved along the patient. 
The movement is along the patient's skin, but may be along a vessel or organ within the patient (e.g., scanning with a catheter-based transducer). The general process for scanning and imaging includes placing the transducer on the patient, rotating and/or translating the transducer to examine the patient, scanning while the transducer is moving, and generating images as part of the examination. In this process, the transducer may or may not be lifted away from the patient. The images are generated while the transducer is moved along the patient and/or during scanning with the transducer. Alternatively, frames of data are acquired during the examination or displacement of the transducer, and the images are generated later.

In the diagnostic mode, scanning is performed while the transducer is moved relative to the patient. The movement is that of the transducer, rather than movement of internal parts of the patient. The transducer moves relative to the patient's tissue in contact with the transducer. The movement of the transducer causes the ultrasound scanner to scan different planes or volumes of the patient at different times. In this way, the patient is examined over time. The scanning sequence acquires data representing different parts of the patient.

In one embodiment, the transducer is moved manually by the user or sonographer. The user holds the transducer and slides and/or rotates the transducer to examine the patient, such as by translating the transducer to scan different planes through an organ (e.g., a breast). In another embodiment, the transducer is moved automatically. A robot sweeps the transducer over the patient. The robot controls the position and/or movement of the transducer. For example, a robotic arm moves one or more joints for translation and/or rotation of the transducer. 
In another example, the robot includes a chain drive, a screw drive, a gear, a rack-and-pinion mechanism, or another drive for moving the transducer along a plate or a flat or curved surface. Any now known or later developed robotic system with a motor may be used to move the transducer without user force being applied for movement during scanning. An automated volume scan of the breast or another scan may use a robot.

The robot and/or the user moves the transducer over a predefined region. A predetermined pattern is used for the scanning. For example, the robot moves the transducer to different points, over a range of rotation, and/or over a range of translation to scan. The predetermined pattern defines the spacing, velocity, ranges, and/or steps used in the transducer motion. The time and/or position at which each scan takes place may be fixed. Alternatively, scanning is continuous or periodic regardless of position while the transducer is moved. In other embodiments, the examination is performed over any region without predetermination, such as during a manual examination.

Figure 2 shows an example of diagnostic mode acquisition 40. The transducer 14 is moved, resulting in frames of data 44 being acquired. The different frames 44 represent different parallel planes in this example. In the diagnostic imaging mode, a large amount of data is collected. Each scan of a plane or volume provides a frame of data. Frames of data are provided for the various positions of the transducer. An image may be generated from each frame. With persistence or other compounding, images may be generated from multiple frames. Dozens, hundreds, or thousands of frames and/or images are created to examine the patient in an examination (e.g., one performance of the predetermined pattern and/or during a given visit of the patient to the sonographer). The large number of frames or images poses a challenge in analyzing the data and identifying the suspicious targets. 
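As an illustrative sketch (not part of the patent), the predetermined pattern above can be thought of as a fixed set of transducer positions at which frames are acquired. The function and parameter names (`scan_positions`, `range_mm`, `step_mm`) are assumptions for illustration only.

```python
# Hypothetical sketch of a predetermined linear scan pattern: the robot
# visits evenly spaced transducer positions over a fixed translation range,
# acquiring one frame of data at each position.
def scan_positions(range_mm: float, step_mm: float) -> list[float]:
    """Return the transducer positions (in mm) for one linear sweep."""
    n_steps = int(range_mm / step_mm)
    return [i * step_mm for i in range(n_steps + 1)]

# A 100 mm sweep at 5 mm steps yields 21 positions (and 21 frames).
positions = scan_positions(range_mm=100.0, step_mm=5.0)
```

The fixed position list is what makes later return-to-location possible: a detection in frame i maps directly back to `positions[i]`.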
In action 28, the ultrasound scanner or image processor applies computer-aided detection to each of the frames or images in the sequence acquired by the scan. The images may be display values (RGB) or scalar data used to generate display values. Frames are images or data at other stages of processing. The scan provides a sequence of frames and/or images. The computer-aided detection and the other processing described herein are applied to frames and/or images.

Any now known or later developed computer-aided detection may be applied. For example, pattern matching is used to determine whether a pattern indicative of a tumor or cyst is located in the frame or image. As another example, thresholding, segmentation, or other image processing is applied. For faster detection, a machine-learned detector may be applied. The machine-learned detector is a Bayesian network, a support vector machine, a neural network, or another detector relating input features of the frame or image (e.g., steerable features or Haar wavelets) to suspicious objects. When the computer-aided detection is based on machine learning, self-learning or feedback learning may be used. When a physician reviews detected objects and indicates whether an object is suspicious or of interest, this information may be used as additional training data for retraining or updating the detector with the additional ground truth.

The computer-aided detection is applied to all the frames. In other embodiments, the detection is applied to fewer than all the frames. When the ultrasound scanner is configured for a diagnostic mode, the detection may be applied to all the frames. In other approaches, the detection is applied to frames associated with motion of the transducer. 
By using a sensor on the transducer, knowledge of the operation of the robot, and/or data correlation to detect motion of the transducer, the ultrasound scanner may apply the detection to frames associated with motion and not to frames where the transducer is stationary.

The computer-aided detection is applied as the frames or images are acquired. The application is in real time. The period for processing a frame is equal to or less than the period for acquiring a new frame, allowing the application to occur in real time, such as performing the application of the detection within one second of creation by the scan. In other embodiments, the application is in a post process once the scanning is complete. Combinations of applying the detection during scanning and as a post process may be used.

Different computer-aided detectors may detect different types of objects and/or objects in different situations. Multiple detectors may be applied to each frame. In one embodiment, the detectors to be applied are selected based on the type of examination. For example, the user configures the ultrasound scanner for a breast examination. A detector or detectors for detecting suspicious objects (e.g., tumors and/or cysts) in the breasts are selected and applied. As another example, the user configures the ultrasound scanner for a breast examination to detect cancer. A detector or detectors for detecting cancerous objects in the breasts are selected. The selection is automatic by the processor, or the user selects the detectors.

In action 30, one or more objects are identified by the image processor or the ultrasound scanner. One or more objects are identified in any number of frames or images. Some frames may be determined not to include any of the target objects. For each or some of the acquired images, targets are recognized. 
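The per-frame application of action 28 and the identification of action 30 can be sketched as a simple loop over the sequence. This is an assumed illustration, not the patent's detector: `detect()` stands in for any of the named approaches (pattern matching, thresholding, or a machine-learned classifier), and its threshold criterion is a placeholder.

```python
# Minimal sketch of applying computer-aided detection to each frame of the
# sequence as it is acquired. The detection criterion here is a placeholder
# standing in for pattern matching or a machine-learned detector.
def detect(frame: list[int]) -> bool:
    """Hypothetical detector: flags a frame whose peak intensity is high."""
    return max(frame) > 200  # illustrative threshold, not from the patent

def identify_objects(frames: list[list[int]]) -> list[int]:
    """Return indices of frames in which an object was identified."""
    return [i for i, frame in enumerate(frames) if detect(frame)]

# Three toy frames of echo intensities; only the middle one is suspicious.
frames = [[10, 20, 30], [50, 250, 40], [15, 25, 35]]
suspicious = identify_objects(frames)
```

In a real-time implementation, `detect` would be called on each new frame inside the acquisition loop rather than over a stored list.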
For example, the computer-aided detection locates suspicious objects in five of a thousand images. A tumor or other object is identified. It is possible that no objects are detected. Figure 2 shows an example. In the detection operation 42, the object 46 is detected in one of the frames 44. The transducer 14 is positioned at a given location relative to the patient to acquire that frame 44 with the object 46.

When the computer-aided detection runs in real time, suspicious areas are highlighted during the real-time scan while examining the patient. The objects are identified during the scanning. For post processing, suspicious areas are located once the scan is complete. When persistence, steered spatial compounding, or other compounding is used, the identification may be in frames that are not displayed, or in frames that are components of the persisted or compounded imaging. For example, the detector is applied to component frames used for persistence or spatial compounding. For diagnostic imaging, the images with persistence (e.g., temporally filtered) or spatial compounding (e.g., combining frames with different steering directions) are displayed without displaying the component frames or images. The detection is applied to the component frames or images and/or to the frames or images as compounded. When the detection is positive, the component frame or image may be displayed without the compounding or persistence. Alternatively or additionally, detection or not in multiple ones of the component frames may be used to indicate a confidence in the detection.

In action 32, the ultrasound scanner or image processor records a location of the ultrasound transducer in response to a detection of the target in one of the images. The detector detects an object in a frame. That frame corresponds to a given position of the transducer. The position of the transducer for the frame with the detection is recorded. 
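Electronic recording of the transducer position for a flagged frame (action 32) might look like the following sketch. The pose fields (x, y, angle) and function names are illustrative assumptions; the patent does not prescribe a data layout.

```python
# Sketch of electronically recording the transducer pose for each frame in
# which an object is detected, keyed by the frame's index in the sequence.
recorded_locations: dict[int, tuple[float, float, float]] = {}

def record_location(frame_index: int, pose: tuple[float, float, float]) -> None:
    """Store the transducer pose (x mm, y mm, angle deg) for a detection."""
    recorded_locations[frame_index] = pose

def recall_location(frame_index: int) -> tuple[float, float, float]:
    """Look up where the transducer was when the flagged frame was acquired."""
    return recorded_locations[frame_index]

record_location(412, (85.0, 30.0, 12.5))  # object detected in frame 412
pose = recall_location(412)               # used later to return and rescan
```

Keying by frame index ties the recorded pose to the flagged frame, so either the frame or the physical location can be recalled for the later measurement of action 36.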
The recording may be performed for all frames, so that the position may be looked up for any frame with a detected object, in response to the detection. Alternatively, the recording is performed only for frames where an object is detected. The recording is electronic or physical. The transducer location may be tracked, such as by an optical system, a magnetic system, or other transducer tracking. The lateral position and the orientation of the transducer are tracked. These systems electronically record the location relative to a calibrated frame of reference. Inertial tracking may be used. For a physical record, the position of the transducer is physically marked. For example, the transducer releases a colored ink or dye upon the detection, so that the location is marked on the patient and/or in the acoustic coupling gel. As another example, a marking is indicated on a frame or guide for the transducer.

The frame or image may likewise be recorded, with or without the identified object. A set of frames or images with detected objects is created for reference by a user. Alternatively, the identity of the frame or image within the sequence of frames or images is recorded. The frame or image may then be recalled from the sequence.

In action 34, the image processor or ultrasound scanner notifies of the detection. The notification is an output for the user. Any output may be used. In one embodiment, the output is visual on a display. Text or a symbol indicating that an object has been detected is output. The image with the detected object may be output, with or without highlighting of the detected object in the image, as the notification. Another example output is the staining or marking of the patient or the acoustic gel. In one embodiment, the notification includes the location. 
The location is provided as coordinates, as a spatial position on the patient (e.g., shown on a graphic or marked on the patient), as feedback indicating whether the transducer is moving toward or is at the location, or as another indicator of the location. In additional or different embodiments, the notification is provided by audio. A sound is emitted upon detection. The sound indicates that the transducer is at the location in a real-time implementation. Alternatively, the sound is provided as feedback to guide the user in positioning the transducer at the location (e.g., a periodic tone whose frequency increases as the transducer nears the location). Other notifications may be used, such as a combination of audio and video. A tactile notification (e.g., vibration of the transducer) and/or an odor may be used. The notification is of the occurrence, the location of the transducer and/or the scan plane, the location of the object in the image, other information related to the detection, and/or combinations thereof.

The notification is provided upon the occurrence of the detection. Once detected, the notification is output so that the transducer may be held at the position to gather additional information in action 36. For a robotic implementation, the notification may be used so that the patient can be informed that the examination may take longer than expected. In different or additional embodiments, the notification is provided seconds, minutes, hours, or another period after the occurrence of the detection. For example, once the scanning in the predetermined pattern is completed, the robotic system notifies that another scan is to occur because of the detection of one or more objects. As another example, a physician loads the examination results for later review. The physician is notified that the detection occurred. This notification may be a set of images with detected objects, flags, or other information. 
In one embodiment, the ultrasound scanner or image processor provides additional detection information with, or other than, the location of the transducer, the occurrence, and/or the location of the object in the image. For example, a confidence in the detection is provided. Machine-learned classifiers may provide confidence information. Other sources of confidence may be used, such as a degree of correlation of the image with a template, or a confidence based on fuzzy logic. When multiple frames or images with the same field of view or overlapping fields of view are provided (e.g., due to slow transducer motion and/or a compounding type of imaging), the confidences across the multiple frames may be combined (e.g., averaged) to provide the confidence for the particular object. This confidence is output to the user as a percentage, a color coding, or another indicator. In one embodiment, different colors and/or intensities of the highlighting of the object in the image represent different ranges or levels of confidence. Other confidence indicators may be provided, such as the order of presentation of the images. Stored images are provided with the highest-confidence detections first, last, or in an order ranked by confidence.

In other embodiments, the user is not notified. Instead, actions 36, 38, and 40 are performed without a separate notification. Action 40 may be a form of notification, in the sense that providing a value of a mechanical property indicates that an object was detected.

In action 36, the ultrasound scanning device, another scanning device (e.g., x-ray), and/or laboratory test equipment (e.g., robotic biopsy) acquires data representing the object in response to the detection. The same or different measurement tools (e.g., magnetic resonance, ultrasound, manual palpation, or another tool) are used to acquire the data for a mechanical property. The data is acquired by measuring the object in the patient. 
For example, the ultrasound scanning device scans the object again to gather different or additional data. Alternatively, the data is acquired from a memory, such as using the frame or image data from the location (i.e., the data used to detect the object) to derive other information.

The acquired data is for a mechanical property. Rather than merely scanning for imaging (e.g., echo intensity (B-mode) and/or velocity or flow power (flow mode)), data representing a mechanical property or characteristic of the object itself is acquired. For example, the elasticity, shear velocity, Young's modulus, strain, strain rate, or another parameterization of the object is measured. The measurement uses more than just one frame of data or image to derive the characteristic of the object. The measurement may be focused on the object and/or have a smaller field of view than used for generating the images for the examination.

The image processor, ultrasound scanner, or other controller causes the data acquisition for the detected object. The acquisition is automated, semi-automatic, or manual. The controller may display instructions to the user so that the user acquires the data during the examination or a given visit of the patient. The user moves the device for the measurement. The controller may perform certain operations automatically, such as the measurement and/or the positioning for the measurement, once the user activates and/or positions. The controller may locate for the measurement (e.g., move the transducer robotically) and execute the measurement (e.g., a scan for a mechanical property of the object) without user input of the location, scanning, and/or activation. The data is acquired in real time (e.g., during the same examination) or later (e.g., manually, offline). To acquire the data for the mechanical property, the location of the object is used. 
The ultrasound scanner, controller, or image processor uses the location at which the transducer was positioned when it scanned the object. This location of the transducer, and the location of the object relative to the transducer based on the scan format, are used to acquire the data. The data is acquired from the object, and the location of the transducer is used to indicate where the object may be found. By using a marker and/or position sensing, the location information is used to position a device for the measurement. As represented by the dashed arrow between actions 32 and 36, the recorded location of action 32 may or may not be used to guide the acquisition of other information about the object in action 36. The recording of the location of the suspicious images may come from a robotic pose, an electromagnetic sensor, an inertial locator, a marker released by a probe, the acoustic images themselves, and/or a recorded video of the scan.

For example, the robot positions the transducer to acquire the data used to measure the characteristic. A motor connected with the ultrasound transducer causes the transducer to return to a position to scan the object. The position is the same position used during the examination from which the object was detected. The transducer is returned to the location at which the transducer scanned the patient to acquire the frame in which the object is detected. Upon completing the predetermined scan pattern for the examination, or interrupting the pattern, the transducer is held at or repositioned to the location to scan the object again. The mechanical property is then measured.

In the manual examination embodiment, the user stops the examination, slows the examination, and/or goes back after the examination to acquire the data. For example, the transducer is stopped at or returned to the location to acquire the data (e.g., to perform shear wave measurements). As another example, the user temporarily pauses the examination upon the notification, keeping the transducer in place. 
The transducer is then used to acquire the other data. In the automated or robotic examination embodiment, the robot stops the examination or returns the transducer to the location after the end of the examination. Additional interrogation, such as elasticity imaging, is performed by the ultrasound scanner after the transducer returns to the location corresponding to the detected object. The acquisition of the data, based on the real-time detection of the object, takes place as part of the examination and/or without the patient leaving the examination. The examination includes the collection of information relating to the suspicious object or objects, done automatically, providing more information or complete information at the end of the appointment or the given examination of the patient. The examination information and other information for a diagnosis are acquired without the patient having to make multiple appointments or be examined multiple times. The detection and data acquisition may also provide a more compact report, such as sending the frames or images with detected objects and the acquired data, with or without the examination, so that the radiologist can focus on the information of interest. In action 38, the image processor, ultrasound scanner, and/or other device acquiring the data of action 36 derives a value of a characteristic of the target from the acquired data. The acquired data is used to derive the mechanical property. The characteristic of the object is measured as a mechanical property, but other characteristics of the object may be calculated. In response to the identification or detection of the object, the mechanical property is measured. Any mechanical property or other characteristic may be derived. For example, the ultrasound scanner measures strain, strain rate, shear wave velocity, elasticity, or Young's modulus. These mechanical properties represent more than a response to an imaging energy.
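A shear wave measurement of the kind named above can be reduced to numbers as in this minimal sketch, which fits a line to the arrival time of the peak displacement at several lateral positions and then applies the common soft-tissue approximation E ≈ 3ρc² (near incompressibility, density of about 1000 kg/m³). The function names and the synthetic tracking data are illustrative assumptions:

```python
import numpy as np

def shear_wave_speed(time_to_peak_s, lateral_positions_m):
    """Shear wave speed is the slope of lateral position versus arrival
    time of the peak displacement, here from a least-squares line fit."""
    speed, _intercept = np.polyfit(time_to_peak_s, lateral_positions_m, 1)
    return speed

def youngs_modulus(shear_speed_m_s, density_kg_m3=1000.0):
    """E ~= 3 * rho * c_s^2 for nearly incompressible soft tissue."""
    return 3.0 * density_kg_m3 * shear_speed_m_s ** 2

# Synthetic tracking result: peaks arrive 0.5 ms apart at 1 mm spacing -> 2 m/s
times = np.array([0.0, 0.5, 1.0, 1.5]) * 1e-3
positions = np.array([0.0, 1.0, 2.0, 3.0]) * 1e-3
c = shear_wave_speed(times, positions)   # 2.0 m/s
E = youngs_modulus(c)                    # 12 kPa
```

A clinical implementation tracks displacement at many depths and regularizes the fit, but the speed-to-modulus conversion is the same.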
For strain and strain rate, multiple scans are provided to derive the strain due to tissue movement. For shear wave velocity, elasticity, or Young's modulus, tissue is displaced by an acoustic or other force, and the tissue response to the displacement or generated wave is measured. The characteristic of the object is derived from more than mere imaging. Multiple scans may be used to then calculate the displacement over time, which is used to derive a shear wave velocity or elasticity. While a frame from the examination may be used as a reference for calculating a displacement, other frames of data are acquired in action 36 to derive the characteristic of the object. In action 40, the ultrasound scanner, image processor, or other device outputs the value or derived values. The value may be useful for diagnosis and gives information in addition to the imaging. The value is output to the user, such as on a display. In one embodiment, a display of the image with the detected object highlighted is generated. The value of the characteristic is provided as an annotation, a label, and/or in the image as text, a graph, a bar, color coding, and/or brightness coding. By viewing the display, the image of the patient and the object is provided, as well as an indication of the mechanical property or properties of the object. In one embodiment, the value is provided as or with the notification of action 34. In other embodiments, the value is provided separately from the notification, such as providing the value after the notification and during a subsequent analysis for a diagnosis and/or to confirm the acquisition of the additional data in action 36. In either case, the user is alerted about the characteristic of the object, which helps with the examination. Outputs other than the display may be used. The value may be stored in the patient's file, along with the examination, and/or with records of the location, the object, and/or the frames with the detected objects.
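The two ingredients named above — a shift between frames estimated by cross-correlation, and strain as the spatial derivative of the displacement profile — can be sketched as follows. The helper names and synthetic signals are illustrative assumptions, not the patent's code:

```python
import numpy as np

def window_shift(pre, post):
    """Integer-sample shift of `post` relative to `pre` from the peak of
    the full cross-correlation (a coarse stand-in for speckle tracking)."""
    corr = np.correlate(post, pre, mode='full')
    return int(np.argmax(corr)) - (len(pre) - 1)

def strain_from_displacement(displacement_m, depth_m):
    """Axial strain is the spatial gradient of the displacement profile."""
    return np.gradient(displacement_m, depth_m)

# A pulse delayed by 3 samples is recovered as a +3 shift
n = np.arange(256)
pre = np.exp(-0.5 * ((n - 128) / 6.0) ** 2)
post = np.roll(pre, 3)
shift = window_shift(pre, post)

# Uniform 1% compression: displacement grows linearly with depth
depth = np.linspace(0.0, 0.04, 81)
strain = strain_from_displacement(0.01 * depth, depth)
```

Production trackers refine the integer lag with sub-sample interpolation and compute the shift per depth window rather than globally.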
The value is then displayed to the user during the analysis. Action 28 is applied to all or multiple frames of the sequence generated by the scan in the diagnostic mode. When objects are detected in multiple frames in action 30, the locations and/or frames are stored in action 32, and separate notifications are provided in action 34. In other embodiments, a notification with a list of the locations, frames, images, or other information about the detected objects is provided at the end of the examination or later. Since action 32 is repeated for each object, a group of frames with detected objects is identified or assembled. This group may be output separately from the examination for review or subsequent analysis. Actions 36, 38, and 40 are executed for each object. Since the user analyzes the images of the detected objects with the derived values, the user can indicate whether or not the detection of the object is accurate. For example, the value and/or the image may show the user whether or not the object is a tumor, a cyst, or another object of interest. The user enters whether or not the detection is accurate. This feedback is used as ground truth information. The image and the feedback are provided for machine learning to update the detector applied in action 28. Figure 3 shows an embodiment of a system for alert assistance in ultrasound imaging. The system alerts by applying object detection, providing assistance. As an alternative or in addition, the system alerts the user of the object detection. The ultrasound system is configured to examine a patient. During the examination, the acquired data is analyzed by the system for objects of interest. If such objects are detected, then the system acquires mechanical property information relating to the object. Rather than simply providing images from an examination, images and mechanical property information for the objects of interest are provided. The system 10 is an ultrasound imaging system for medical diagnosis.
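The accept/reject feedback loop described above can be sketched with a toy online learner — here a plain perceptron update standing in for whatever machine-learnt detector the scanner actually uses; the class and its update rule are illustrative assumptions:

```python
import numpy as np

class FeedbackDetector:
    """Toy linear detector nudged by user accept/reject feedback.
    A stand-in for the machine-learnt detector of action 28."""
    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return 1 if float(x @ self.w + self.b) > 0.0 else 0

    def feedback(self, x, user_label):
        # Perceptron rule: move the boundary toward confirmed ground truth
        err = user_label - self.predict(x)
        self.w += self.lr * err * np.asarray(x, dtype=float)
        self.b += self.lr * err

det = FeedbackDetector(n_features=2)
# User confirms detections with features [2, 1], rejects those with [-2, -1]
for _ in range(10):
    det.feedback(np.array([2.0, 1.0]), 1)
    det.feedback(np.array([-2.0, -1.0]), 0)
```

In practice the feedback records would be stored and used to retrain the full detector offline rather than updating it one sample at a time.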
In other embodiments, the system 10 is a personal computer, a workstation, a PACS station, or another arrangement at the same location or distributed over a network, for real-time or post-acquisition imaging by means of a connection with the beamformers 12, 16 and a transducer 14. The system 10 implements the method of Figure 1, the approach of Figure 2, or other methods. The system 10 includes a robot 11, a transmit beamformer 12, a transducer 14, a receive beamformer 16, an image processor 18, a display 20, a memory 22, and a processor 24. Additional, different, or fewer components may be provided. For example, a user input is provided for manual or assisted designation of a region of interest within a field of view for mixed-mode imaging and/or for configuring the ultrasound system for mixed-mode imaging. As another example, the robot 11 is not present. The robot 11 is a motor and a device for moving the transducer 14 with a force from the motor. The robot 11 may have any number of arms and joints. In other embodiments, the robot 11 is a gantry supporting the transducer 14 along rails, where the motor moves the transducer 14 along the rails. Gears, chains, a screw drive, or other devices may be provided for translating the motor force (e.g., rotation) into a movement of the transducer 14. Under the control of the processor 24 or another controller, the robot 11 is configured to move the transducer in a predetermined pattern. The movement is constant or in steps. Any pattern may be used, such as moving the transducer 14 along a line from a starting point to a stopping point. Another pattern moves the transducer 14 point by point in a regular grid on the patient. The pattern may or may not include tilting or rotation of the transducer 14. The robot 11 may be configured to move the transducer 14 to particular locations based on the object detection. This other movement is performed at the end of the movement for the predetermined pattern.
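The point-by-point grid pattern described above can be generated, for example, as a serpentine (boustrophedon) waypoint list so the transducer never jumps across the grid between rows. The function name and coordinate units are illustrative assumptions:

```python
def grid_scan_pattern(x_vals, y_vals):
    """Serpentine waypoints: traverse x left-to-right on even rows and
    right-to-left on odd rows, stepping through y row by row."""
    waypoints = []
    for i, y in enumerate(y_vals):
        row = x_vals if i % 2 == 0 else list(reversed(x_vals))
        waypoints.extend((x, y) for x in row)
    return waypoints

path = grid_scan_pattern([0, 10, 20], [0, 10])  # positions in mm, say
```

The controller would feed these waypoints to the robot one by one, optionally inserting extra waypoints at recorded detection locations after the pattern completes.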
The robot 11 connects to the transducer 14. The connection is fixed or releasable. For example, a gripping member of the robot 11 holds the transducer 14 but can release the transducer 14. As another example, the transducer 14 is fixed by screws, bolts, latches, or a snap fit to a holder of the robot 11. The transmit beamformer 12 is an ultrasound transmitter, memory, pulser, analog circuit, digital circuit, or combinations thereof. The transmit beamformer 12 is configured to generate waveforms for a plurality of channels with different or relative amplitudes, delays, and/or phasing. The waveforms are generated and applied to a transducer array with any pulse repetition rate or frequency. For example, the transmit beamformer 12 generates a pulse sequence for a B-mode scan in a linear, sector, or Vector® format. As another example, the transmit beamformer 12 generates a sequence of pulses for color flow scanning, such as pulses to form 2-12 beams in a constant flow sample count per scan line for a region of interest within a B-mode field of view. In yet another example, the transmit beamformer 12 generates a pulse sequence for elasticity or shear imaging. The transmit beamformer 12 may generate a beam for an acoustic radiation force pulse. The intensity of the beam causes a shear wave or a longitudinal wave to be generated from the focal point. The transmit beamformer 12 then generates beams to track the tissue response to the generated wave. The transmit beamformer 12 connects with the transducer 14, such as through a transmit/receive switch. Upon transmission of acoustic waves from the transducer 14 in response to the generated waveforms, one or more beams are formed during a given transmission event. The beams are for B-mode, color flow mode, elasticity mode, shear wave mode, and/or other imaging modes. A sequence of transmit beams is generated to scan a one-, two-, or three-dimensional region.
Sector, Vector®, linear, or other scan formats may be used. For each position of the transducer 14, or as the transducer 14 moves, a complete scan of the region is performed. Multiple such complete scans are performed with the transducer 14 at different locations or ranges of locations. The transducer 14 is a 1-, 1.25-, 1.5-, 1.75-, or 2-dimensional array of piezoelectric or capacitive membrane elements. The transducer 14 includes a plurality of elements for transduction between acoustic and electrical energies. For example, the transducer 14 is a one-dimensional PZT array with about 64-256 elements. The transducer 14 connects with the transmit beamformer 12 to convert electrical waveforms into acoustic waveforms, and connects with the receive beamformer 16 to convert acoustic echoes into electrical signals. The transducer 14 transmits beams. To form the beams, the waveforms are focused at a tissue region or location of interest in the patient. The acoustic waveforms are generated in response to applying the electrical waveforms to the transducer elements. To scan with ultrasound, the transducer 14 transmits acoustic energy and receives echoes. Receive signals are generated in response to ultrasound energy (echoes) impinging on the elements of the transducer 14. The receive beamformer 16 includes a plurality of channels with amplifiers, delays, and/or phase rotators, and one or more summers. Each channel connects with one or more transducer elements. The receive beamformer 16 applies relative delays, phases, and/or apodization to form one or more receive beams in response to each imaging transmission. Dynamic focusing on receive may be provided. The relative delays and/or phasing and the summation of signals from the different elements result in beamformation. The receive beamformer 16 outputs data representing spatial locations using the received acoustic signals.
In other embodiments, the receive beamformer 16 is a processor for generating samples using a Fourier transform or the like. The receive beamformer 16 may include a filter, such as a filter for isolating information at a second harmonic, at a transmit frequency (i.e., fundamental) band, or at another frequency band relative to the transmit frequency band. Such information may more likely include desired tissue, contrast agent, and/or flow information. In another embodiment, the receive beamformer 16 includes a memory or buffer and a filter or adder. Two or more receive beams are combined to isolate information at a desired frequency band, such as a second harmonic, a cubic fundamental, or another band. The receive beamformer 16 outputs beam-summed data representing spatial locations. Data for a single location, locations along a line, locations for an area, or locations for a volume are output. The data beamformed in response to a complete scan of a region is a frame of data. As the transducer moves, such as by the robot 11, the complete scan of each region is performed, yielding frames of data representing spatially different fields of view. The image processor 18 is a B-mode detector, a Doppler detector, a pulsed wave Doppler detector, a correlation processor, a Fourier transform processor, a filter, another now known or later developed processor for implementing an imaging mode, or combinations thereof. The image processor 18 provides detection for the imaging modes, such as including a Doppler detector (e.g., an estimator) and a B-mode detector. A spatial filter, a temporal filter, and/or a scan converter may be included in or implemented by the image processor 18. The image processor 18 outputs display values, such as by detecting, mapping the detected values to display values, and formatting the display values or detected values into a display format.
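The delay-and-sum operation of the receive beamformer can be sketched with integer-sample delays; real beamformers use fine, dynamically updated delays with interpolation, so this only shows the principle, and the function name and test data are illustrative assumptions:

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples, apodization):
    """Apply per-channel focusing delays and apodization weights, then sum.
    channel_data: (n_channels, n_samples); each delay advances its channel."""
    n_ch, n_s = channel_data.shape
    out = np.zeros(n_s)
    for ch in range(n_ch):
        d = int(delays_samples[ch])
        out[: n_s - d] += apodization[ch] * channel_data[ch, d:]
    return out

# Echo arrives at sample 5 on channel 0 and sample 7 on channel 1;
# delaying channel 1 by 2 samples aligns the two echoes coherently.
data = np.zeros((2, 16))
data[0, 5] = 1.0
data[1, 7] = 1.0
beam = delay_and_sum(data, delays_samples=[0, 2], apodization=[1.0, 1.0])
```

With the correct delays the aligned echoes add constructively (amplitude 2 at one sample), while misaligned delays would spread the energy over two samples.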
The image processor 18 receives the beamformed information and outputs image data for display. The processor 24 is a control processor, a general processor, a digital signal processor, a graphics processing unit, an application-specific integrated circuit, a field-programmable gate array, a network, a server, a group of processors, a data path, combinations thereof, or another now known or later developed device for detecting objects in images and controlling the ultrasound system accordingly. The processor 24 is separate from or part of the image processor 18. As a separate device, the processor 24 requests, receives, accesses, or loads data at any stage of processing (e.g., beamformed, detected, scan converted, mapped to a display, or another stage) for the detection and control. The processor 24 is configured by software and/or hardware to implement or cause the implementation of the actions of Figure 1. The processor 24 is configured to apply computer-assisted detection to the scan results. The frames of data from the receive beamformer 16 and/or from any processing stage of the image processor 18 are input to the computer-aided detection. For example, Haar wavelets, gradients, steerable features, and/or other features are calculated from each frame of data. These features are input as a feature vector to a machine-learnt detector. On the basis of these features, the detector indicates whether or not the object is in the image, a location of any object of interest in the image, and/or a confidence in any detection. In another example, a template or model representing the object of interest is correlated by the processor 24 with the frame of data at various relative positions. If a sufficient correlation is found, an object of interest is detected. Any now known or later developed computer-aided detection may be used. The processor 24 is configured to control the robot 11. The robot 11 maintains or returns the transducer 14 to the same or a similar position (e.g.
with an overlapping field of view) to where an object was scanned. Based on the detection of an object of interest, the processor 24 determines the location of the transducer 14 at the time of the scanning for the frame with the object. The transducer 14 is stopped at this position to acquire data for measuring a mechanical property. As an alternative, the predetermined scan pattern or movement pattern of the transducer 14 by the robot 11 is completed, and then the processor 24 causes the robot 11 to return the transducer 14 to the location. In other embodiments, the processor 24 generates a notification for the user. For example, a notification is presented on the display 20. As another example, the transducer 14 is controlled to mark (e.g., color) the location on the patient. The processor 24 may be configured to provide feedback to the user for manually positioning the transducer 14, such as indicating an amount and direction of movement, proximity to the location, or another communication allowing the user to position the transducer 14 at a location or hold the transducer 14 at a common location to acquire the additional data. The processor 24 is configured to record the location, the frame with the object, the object detection, the detection confidence, and/or other information. The information is recorded with or separately from the image results of the examination. The processor 24 is configured to derive a mechanical property of tissue. The beamformers 12, 16 are controlled to acquire additional data relating to the object once the transducer 14 is at the correct location. For example, elasticity or shear wave tracking is performed. The processor 24 uses the acquired data to calculate a mechanical property of the detected object. The processor 24 or the image processor 18 generates and outputs images or values to the display 20. For example, B-mode or mixed-mode (e.g., B-mode and flow mode) images are output.
A text, numeric indication, or graph may be added and displayed for the user. A graph may be displayed. For example, an annotation marking a detected object, a flag indicating the image as including a detected object, the derived value of the mechanical property of the object, the detection confidence, or other information relating to the object is output. The images associated with the detected objects are flagged, such as by providing the images on the display 20 separately from a CINE presentation of the examination. The output of the value and/or the highlighting of the object may likewise mark an image as including a detected object. Location information, such as of the transducer 14, may be output. During the examination, the display 20 displays images representing different fields of view or different regions in the patient. Flags, alerts, notifications, values, or other information may be displayed at this time or during a subsequent analysis. The display 20 is a CRT display, LCD display, monitor, plasma display, projector, printer, or other device for displaying an image or sequence of images. Any now known or later developed display may be used. The display 20 may be used to display an image or a sequence of images. The display 20 displays two-dimensional images or three-dimensional representations. The image processor 18, the processor 24, the receive beamformer 16, and the transmit beamformer 12 operate according to instructions stored in the memory 22 or another memory. The instructions configure the system for carrying out the actions of Figure 1. The instructions configure the image processor 18, the processor 24, the receive beamformer 16, and/or the transmit beamformer 12 for operation by being loaded into a controller, by causing the loading of a table of values (e.g., an elasticity imaging sequence), and/or by being executed. The memory 22 is a non-transitory computer-readable storage medium.
The instructions for implementing the processes, methods, and/or techniques discussed herein are provided on the computer-readable storage media or memories, such as a cache, a buffer, a RAM, a removable medium, a hard drive, or another computer-readable storage medium. Computer-readable storage media include various types of volatile and nonvolatile storage media. The functions, actions, or tasks illustrated in the figures or described herein are performed in response to one or more sets of instructions stored in or on a computer-readable storage medium. The functions, actions, or tasks are independent of the particular type of instruction set, storage medium, processor, or processing strategy, and may be performed by software, hardware, integrated circuits, firmware, microcode, and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like. In one embodiment, the instructions are stored on a removable media device for reading by local or remote systems. In other embodiments, the instructions are stored at a remote location for transfer through a computer network or over telephone lines. In still other embodiments, the instructions are stored within a given computer, CPU, graphics processing unit, or system. While the invention has been described above with reference to various embodiments, it should be understood that many changes and modifications can be made without departing from the scope of the invention. It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting.
Claims (18)

[1] An alert assistance method for an ultrasound scanning device, the method comprising: scanning a patient with an ultrasound transducer of the ultrasound scanning device in a diagnostic mode in which the ultrasound transducer moves relative to the patient; applying computer-assisted detection by the ultrasound scanning device to each of a sequence of frames acquired by the scanning; identifying an object in a first one of the frames with the computer-assisted detection of the ultrasound scanning device; measuring, in response to the identification of the object in the first frame, a mechanical property of the object; and generating an image with an alert identifying the first frame and the measured mechanical property.

[2] The method of claim 1, wherein the scanning comprises scanning with the movement of the ultrasound transducer controlled by a motor.

[3] The method of claim 1, wherein the scanning comprises scanning with the movement of the ultrasound transducer performed manually by a user.

[4] The method of claim 1, wherein the scanning comprises robotic scanning and wherein the measuring comprises robotic measuring.

[5] The method of claim 1, wherein the applying comprises applying the computer-assisted detection as a machine-learnt object detector usable to process the frames of the sequence while the scanning is performed.

[6] The method of claim 1, wherein the identifying comprises identifying a tumor in the first frame and identifying no tumor in a plurality of other frames of the sequence.
[7] The method of claim 1, wherein the scanning comprises scanning in a predetermined pattern for the sequence, and wherein the measuring comprises controlling a motor connected with the ultrasound transducer to return the ultrasound transducer to a position for the first frame when the predetermined pattern is completed, and scanning for the mechanical property with the ultrasound scanning device.

[8] The method of claim 1, wherein the measuring comprises deriving the mechanical property from, at least in part, the first frame.

[9] The method of claim 1, wherein the measuring comprises measuring a strain, strain rate, shear wave velocity, elasticity, or Young's modulus.

[10] The method of claim 1, wherein the measuring comprises deriving a characteristic of the object other than a response to energy from the imaging.

[11] The method of claim 1, wherein the alert generated with the image is a flag of a location of the ultrasound transducer for scanning the object.

[12] The method of claim 1, further comprising identifying objects in other frames of the sequence and storing the first frame and the other frames.

[13] The method of claim 1, wherein the generating comprises generating the image from the first frame with the alert including an indication that the object is detected in the image.
[14] A non-transitory computer-readable storage medium having stored therein data representing instructions executable by a programmed processor for alert assistance in ultrasound imaging, the storage medium including instructions for: generating ultrasound images of a patient with an ultrasound transducer while the ultrasound transducer is moving along the patient; applying detection of a target to the ultrasound images during the acquisition of the ultrasound images, the applying of the detection including applying computer-assisted detection specific to a type of examination for which the scanning is performed; recording a location of the ultrasound transducer in response to detection of the target in one of the images; notifying of the detection; acquiring data representing the target in response to the detection and using the location, the acquiring of the data including using the image, rescanning the patient at the location, or both; deriving a value of a characteristic of the target from the data, the deriving of the value comprising deriving a mechanical property as the characteristic; and presenting the value.

[15] The non-transitory computer-readable storage medium of claim 14, wherein presenting the value comprises generating a display of the image with the target highlighted and including the value of the characteristic.

[16] The non-transitory computer-readable storage medium of claim 14, wherein recording the location comprises recording a position of the ultrasound transducer electronically or physically.

[17] The non-transitory computer-readable storage medium of claim 14, wherein the notifying comprises indicating the location to a user.
[18] A system for alert assistance in ultrasound imaging, the system comprising: a transducer; a robot connected with the transducer and configured to move the transducer in a predetermined scan pattern; a transmit beamformer and a receive beamformer configured to scan, with the transducer, a patient with ultrasound while the robot moves the transducer; a processor configured to apply computer-assisted detection to results of the scan, to cause the robot to return the transducer to a location of a detection by the computer-assisted detection after the end of the predetermined scan pattern, and to derive a mechanical property of tissue on the basis of information acquired with the transducer returned to the location; and a display usable to display the mechanical property of the tissue with a flag for the results for the location.
Family patents (publication number | publication date):
KR20170108884A | 2017-09-27
CN107198542A | 2017-09-26
KR20190011794A | 2019-02-07
US20170265846A1 | 2017-09-21
DE102017204471A1 | 2017-09-21
Legal status:
2018-03-12 | PLFP | Fee payment | Year of fee payment: 2
2020-03-16 | PLFP | Fee payment | Year of fee payment: 4
2021-12-10 | ST | Notification of lapse | Effective date: 20211105
Priority (application number | filing date | title):
US 15/075,013 (US20170265846A1) | 2016-03-18 | Alert assistance for survey mode ultrasound imaging