METHOD FOR SIMULATING A MEDICAL PROCEDURE AND ARTICLE
Patent Abstract:
Method for simulating a medical procedure, and article. Embodiments of the invention relate to a method of performing a simulation of a medical procedure. A physical model of an anatomical structure and a medical instrument can be provided. 3D digital models of the anatomical structure and the medical instrument can be provided. A location parameter related to a location of a physical medical instrument in relation to the physical model of the anatomical structure can be received. The 3D models of the anatomical structure and the medical instrument can be manipulated according to the location parameter. Publication number: BR112013028273B1 Application number: R112013028273-8 Filing date: 2012-04-30 Publication date: 2020-12-15 Inventors: Olef Thaler; Hadar Segal; Niv Fisher; Shachar Llan; Matan Shabat Applicant: Simbionix Ltd. Primary IPC:
Patent Description:
BACKGROUND OF THE INVENTION [0001] Many medical procedures, such as pelvic floor reconstruction intervention procedures, can present challenges for even the most experienced doctors. When performing a pelvic intervention procedure, such as reconstructing the pelvic floor, a doctor may create a small incision to gain access to the pelvic cavity. Additional steps in the procedure usually involve anatomical recognition with the fingers, that is, a doctor's finger serves as the doctor's eyes or otherwise replaces the senses, and an insertion of a medical device and/or a medical element, for example, a cannula or a grid or support mesh. In reality, a doctor may have to perform parts of the procedure without actually seeing the affected organs, medical instruments and/or medical elements used. [0002] Procedures in which visual feedback is limited or unavailable can present a number of challenges. For example, training a doctor to perform such procedures can be extremely complicated and may put a patient at risk. For example, to train a doctor to perform a pelvic floor reconstruction intervention procedure, an instructor or trainer may physically guide the hand of the trainee into the pelvic cavity and instruct and/or assist the trainee in feeling, identifying or recognizing the organs and tissues in that location. Teaching or training a trainee in the placement of an element such as a support grid can also be done through physical guidance by the trainer while performing tasks, for example, by having both the trainee and the trainer insert their fingers into the pelvic cavity. Other methods of preparing (for example, planning an access strategy or selecting equipment) or training for invasive procedures in which visibility of the relevant region or organs is limited may involve the use of a cadaver. However, such methods are insufficient since several aspects presented by a patient are not presented by a cadaver.
Consequently, such methods cannot be used to accurately plan, prepare or train for invasive procedures in which the visibility of regions, tissues, organs, pathologies or other pertinent aspects is limited. BRIEF DESCRIPTION OF THE DRAWINGS [0003] Embodiments of the invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like reference numerals indicate corresponding, analogous or similar elements, and in which: [0004] Figure 1 shows an exemplary system for simulating an invasive medical procedure according to embodiments of the invention; [0005] Figure 2 is a flow chart illustrating a method for simultaneously simulating an image-guided procedure and presenting medical image data in accordance with some embodiments of the present invention; and [0006] Figures 3, 4, 5, 6 and 7 show exemplary screen captures related to a simulated procedure according to embodiments of the invention. [0007] It will be appreciated that, for simplicity and clarity of illustration, the elements shown in the figures are not necessarily drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. In addition, where deemed appropriate, reference numerals may be repeated among the figures to indicate corresponding or similar elements. DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION [0008] In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention can be practiced without these specific details. In other cases, well-known methods, procedures, components, modules, units and/or circuits have not been described in detail so as not to obscure the invention.
Some features or elements described with respect to one embodiment can be combined with features or elements described with respect to other embodiments. For the sake of clarity, discussion of the same or similar features or elements may not be repeated. [0009] Although embodiments of the invention are not limited in this regard, discussions using terms such as, for example, "processing", "computing", "calculating", "determining", "establishing", "analyzing", "checking" or the like may refer to the operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulates and/or transforms data represented as physical (for example, electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other non-transitory information storage medium that can store instructions to perform operations and/or processes. [0010] Although embodiments of the invention are not limited in this regard, the terms "plurality" and "a plurality" as used herein may include, for example, "multiple" or "two or more". The terms "plurality" or "a plurality" can be used throughout the specification to describe two or more components, devices, elements, units, parameters or the like. Unless explicitly stated, the method embodiments described herein are not limited to a specific order or sequence. Additionally, some of the described embodiments or method elements can occur or be performed at the same time, or simultaneously. [0011] Reference is made to Figure 1, which shows an exemplary system 100 for simulating a medical procedure according to embodiments of the invention. As shown, system 100 may include computing device 101, stationary transmit/receive unit 181, a mobile transmit/receive unit 160 and a mannequin 170.
Device 101 can be any suitable computing device operably connected to input and/or output (I/O) units. As shown, device 101 can include input units 105, output units 120, a model generation unit 110, a simulation unit 115, a management unit 135, a data repository 140, a presentation unit 125, a memory 130 and a controller 131. Input units 105 may include a mouse, a keyboard, a touch screen or pad or any suitable input device. Input units 105 may comprise means for receiving input from physical medical instruments that can be simulated, for example, as described herein. For example, physical objects or instruments such as handles, push buttons and the like, as well as real medical instruments that can be configured to produce output signals, for example, signals related to a movement, location, resistance, orientation or applied force, can be connected to one or more input units 105 to enable manipulation of a digital simulation of such physical objects or instruments. Input units 105 may include a wired or wireless network interface card (NIC). Specifically, input units 105 can receive input from stationary transmit/receive unit 181 and mobile transmit/receive unit 160. [0012] A communication management unit (not shown) can use a NIC included in the output/input units 120/105 to communicate with a database, system or server storing any relevant information or data, to obtain any relevant imaging or other information, data or parameters from such a system, server or database, and can store the obtained data, information or parameters in the local data repository 140. Output units 120 may include display screens, components for interfacing with a display screen to enable visual output, or optionally a speaker or other audio device to enable audible output. Output units 120 may include one or more screens, speakers and/or any other suitable output devices.
Output units 120 may additionally include force feedback components that can apply, cause or generate forces or physical resistance (e.g., friction-like resistance) on physical devices that can be operated or manipulated by a user, for example, to enable a force feedback effect. For example, simulation unit 115 can control devices enabled for force feedback according to a simulation of a procedure. [0013] Output units 120 and input units 105 can communicate with any other component or unit of device 101 and thus can enable such units to communicate with external systems. Units 105, 110, 115, 125 and 135 can be or can comprise software, hardware, firmware or any suitable combination thereof. For example, any of the model generation unit 110, simulation unit 115 and/or presentation unit 125 can be implemented as an application specific integrated circuit (ASIC), for example, placed on an add-on card that can be installed in computing device 101, or these units can be implemented as a field programmable gate array (FPGA) module, a chip or any other unit or module, possibly including specifically designed firmware and hardware. [0014] Stationary transmit/receive unit 181 and mobile transmit/receive unit 160 can transmit and/or receive any information, parameter or signal to/from device 101. Stationary transmit/receive unit 181 and mobile transmit/receive unit 160 can transmit and/or receive any information, parameter or signal to/from each other, for example, stationary transmit/receive unit 181 can receive a signal transmitted by the mobile transmit/receive unit 160, and mobile transmit/receive unit 160 can receive a signal transmitted by the stationary transmit/receive unit 181.
Stationary transmit/receive unit 181 and mobile transmit/receive unit 160 can each be a very low power, high bandwidth radio frequency (RF) transmitter and/or receiver, possibly provided in chip scale packaging, can be combined with a processing chip or circuit and can transmit and/or receive information via an antenna. The stationary transmit/receive unit 181 and the mobile transmit/receive unit 160 can include a power source such as one or more batteries. [0015] The model generation unit 110 may include components or modules to generate a digital model and its graphic representation, for example, a three-dimensional (3D) anatomical model of an anatomical structure, such as a pelvic cavity or any other anatomical structure, organ or area of interest related to an individual's body. A 3D digital model can be generated by the model generation unit 110 according to information received from any suitable source, for example, an imaging system, for example, a medical image received from a CT system via input unit 105. In other embodiments, the information based on which a digital model can be generated can be received, for example, from a manufacturer or provider of a doll or mannequin, for example, mannequin 170. It will be recognized that embodiments of the invention are not limited by the method or system used to generate a 3D digital model of an anatomical structure; any methods or systems can be used to generate such a model without departing from the scope of the invention.
A generation of a digital model of an anatomical organ, system, section or region (for example, by the model generation unit 110) and a simulation of a procedure (for example, by simulation unit 115) may be according to methods, systems and/or other aspects as described in US Patent Application Publication 2009/0177454. As further described herein, a simulated invasive medical procedure may include performing operations on the mannequin 170 and manipulating a digital model according to the operations performed on the mannequin 170. The manipulation of a digital model can be based on signals transmitted by the stationary transmit/receive unit 181, received by the mobile transmit/receive unit 160 and provided to a controller, for example, controller 131, which can process the received signals, determine a location of the mobile transmit/receive unit 160 and update, in real time, one or more 3D digital models based on the location of the mobile transmit/receive unit 160. In some embodiments, determining a location of a doctor's hand, a medical instrument or an element can be performed using one or more stationary devices without using any mobile transmitter or receiver. For example, a device configured to project light (which may be invisible) or other forms of energy and additionally configured to detect light or other energy reflected from objects in a space can be used. For example, infrared (IR) light or another form of electromagnetic radiation can be emitted by a device located near the mannequin 170. The emitted IR light can be returned (for example, reflected) by a medical instrument, by a doctor's hand or by other objects. The reflected IR light can be received by the emitting device or another device and, based on the properties of the reflected light, the location, orientation or other aspects of a medical instrument or any other object can be determined.
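The real-time update just described (receive a tracked location parameter, apply it to the instrument's 3D digital model) can be illustrated by a minimal sketch. The names below, such as `InstrumentModel` and `apply_pose`, are hypothetical and not part of the described system; a real simulator would update a rendered 3D mesh rather than merely store a pose.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Position and orientation of a tracked instrument (angles in radians)."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float


@dataclass
class InstrumentModel:
    pose: Pose = None

    def apply_pose(self, pose: Pose) -> None:
        # In a real simulator this would move the rendered 3D mesh;
        # here we only record the latest tracked pose.
        self.pose = pose


def update_loop(model: InstrumentModel, samples) -> Pose:
    """Apply each tracked sample to the model; return the last pose."""
    for sample in samples:
        model.apply_pose(sample)
    return model.pose


model = InstrumentModel()
samples = [Pose(0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
           Pose(1.0, 2.0, 0.5, math.pi / 4, 0.0, 0.0)]
last = update_loop(model, samples)
print(last.x, last.y, round(last.yaw, 3))  # prints: 1.0 2.0 0.785
```

In an actual system, the sample stream would come from controller 131 processing signals of the transmit/receive units, at whatever rate the tracking hardware provides.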
It will be understood that embodiments of the invention are not limited by the system or method used to determine a location, position or orientation of objects in a space close to the mannequin 170. [0017] Controller 131 can be any suitable controller or processing unit, for example, a central processing unit (CPU). Memory 130 can be any suitable memory component, device, chip or system and can store applications or other executable code that can be executed by controller 131 and/or data, for example, data that can be used by applications or programs being executed by controller 131. For example, executable code, applications or modules implementing the model generation unit 110 and/or simulation unit 115 can be loaded into memory 130 and executed by controller 131. [0018] It will be recognized that device 101 as described herein is an exemplary system. According to embodiments of the invention, device 101 can be implemented on a single computing device or, alternatively, in a distributed configuration, on two or more different computing devices. For example, the model generation unit 110 can operate on a first computing device and be managed by a first management unit, whereas simulation unit 115 can operate on another computing device and be managed by a second management unit that communicates with the first management unit. In another exemplary embodiment, the management unit 135 can operate on a first computing device, the model generation unit 110 can operate on a second computing device, and the simulation unit 115 can operate on a third computing device. [0019] The presentation unit 125 can control, coordinate or manage a display or presentation of video, audio or other aspects of a simulated procedure. For example, the presentation unit 125 can receive data, parameters or other information from a plurality of sources and incorporate the received or obtained data into a presentation to a user.
Presentation unit 125 may coordinate, synchronize or otherwise relate a presentation of information from a plurality of sources within a single presentation. For example, in a single presentation, the presentation unit 125 can use information and/or signals received from the simulation unit 115, from the stationary transmit/receive unit 181 and from the mobile transmit/receive unit 160. The management unit 135 can interact with any applicable module, unit, application or other entity and can perform coordination, scheduling, arbitration, supervision and/or management of flows, procedures or other aspects as described herein. [0020] The data repository 140 can be any component capable of storing digital information. Data repository 140 may include or may be, for example, a hard disk drive, a floppy disk drive, a compact disk (CD) drive, a recordable CD (CD-R) drive, or another suitable removable and/or fixed storage drive. Data repository 140 may include or may be a USB storage device or a FLASH storage device. Although shown as included in device 101, in some embodiments data repository 140 may be an external device, component or system, for example, a network storage device or system operably connected to device 101. It will be recognized that the scope of the present invention is not limited or otherwise affected by the type, nature, operational and/or design aspects of data repository 140. For example, data repository 140 may comprise any suitable number of possibly different storage devices without departing from the scope of the present invention. Any information, data or parameters required by device 101 to perform or participate in a simulation of an invasive procedure can be stored in data repository 140.
For example, management unit 135 can interact, for example, through a network and possibly according to and/or by implementing a predefined protocol, with any external data repository; any relevant information, for example, provided by a manufacturer of mannequin 170 or by a manufacturer or provider of simulated medical instruments, can thus be received and stored in data repository 140. [0021] The mannequin 170 can be any object representing a patient or individual. For example, the mannequin 170 can be a partial or complete, real-size representation of a human body, for example, a doll or other model made of plastic or any other suitable material or substance. Mannequin 170 may include physical models of internal parts or organs, for example, a pelvis, blood vessels, bones and the like. Generally, any part or organ of a patient that may be relevant to a simulation of a procedure can be included in the mannequin 170. The mannequin 170 can be stationary. For example, mannequin 170 can be securely attached to a table or support tray in such a way that it cannot be moved. In some embodiments, although the mannequin 170 may be stationary, some parts, for example, a limb, internal parts or the head, can be moved, for example, to closely simulate a real patient. [0022] Mannequin 170 can be positioned in any way suitable for the simulated operation or procedure. For example, the mannequin 170 can be placed or positioned on its back, or it can be seated or assume any applicable position. A digital model of mannequin 170 can be updated or configured according to the position assumed by mannequin 170. For example, a digital model can assume any position assumed by mannequin 170. Mannequin 170 can include replaceable parts. Consequently, the mannequin 170 can be used to simulate various patients, conditions, pathologies or any applicable aspects of a simulated patient.
For example, several parts (which can be external or internal) can be changed, replaced or modified based on a simulated patient's gender, a medical condition or a pathology. A digital model of mannequin 170 can be updated or configured according to any parts or attributes of mannequin 170. For example, the digital model representing mannequin 170, a part of mannequin 170, or any other physical model of an anatomical structure can be updated according to any part included in the mannequin 170 or in a physical model of an anatomical structure. For example, one of several physical models of internal organs can be mounted on the mannequin 170 (for example, a bone structure or any internal organ). A digital model representing the mannequin 170 and the specific parts adapted to or installed on the mannequin 170 can then be used. [0023] The stationary transmit/receive unit 181 can be securely attached to a table or support tray in such a way that it cannot be moved. The location and/or orientation of the stationary transmit/receive unit 181 can be known and/or recorded. For example, the distance of stationary transmit/receive unit 181 from mannequin 170 (or from a specific part of mannequin 170) can be known. The location, position and/or orientation of the stationary transmit/receive unit 181 can be recorded, for example, stored in data repository 140 and/or loaded into memory 130. Generally, any information or parameter related to a location, position or orientation of the stationary transmit/receive unit 181 and of the mannequin 170 can be known and recorded. Consequently, the location, position and/or orientation of the stationary transmit/receive unit 181 with respect to, or relative to, a location and/or orientation of the mannequin 170 (which, as described herein, can be stationary and whose location and/or orientation can be known and/or recorded) can be known and/or recorded.
As described herein, using information or data related to a location, position and/or orientation of the stationary transmit/receive unit 181 and of mannequin 170, and by determining a location, position and/or orientation of a transmitter (for example, by analyzing a signal transmitted by the transmitter and received by the stationary transmit/receive unit 181), the location, position and/or orientation of the transmitter with respect to the mannequin 170 can be determined. [0024] A signal transmitted by the mobile transmit/receive unit 160 can be received by the stationary transmit/receive unit 181. A signal transmitted by stationary transmit/receive unit 181 can be received by the mobile transmit/receive unit 160. In some embodiments, based on processing of the signals transmitted by the mobile transmit/receive unit 160 and received by the stationary transmit/receive unit 181, the location of the mobile transmit/receive unit 160 can be determined. For example, the electromagnetic field created by a signal transmitted by the mobile transmit/receive unit 160 can be known. For example, parameters or coefficients such as amplitude, attenuation, frequency and the like can all be known (for example, based on a specification provided by a manufacturer of a transmit/receive unit). Similarly, an electromagnetic field created by a signal transmitted by the stationary transmit/receive unit 181 can be known, and a signal received by the mobile transmit/receive unit 160 can be analyzed and/or processed to determine the location of the mobile transmit/receive unit 160. [0025] In addition, one or more signals transmitted by the mobile transmit/receive unit 160 may not be homogeneous or constant in space or direction.
For example, the mobile transmit/receive unit 160 can transmit a first electromagnetic signal in a first direction, for example, along the X axis in a given coordinate system, and another signal in a second direction, for example, along a Y axis in the same coordinate system. Consequently, by analyzing a signal received from the mobile transmit/receive unit 160 by stationary transmit/receive unit 181, the location and orientation of the mobile transmit/receive unit 160 can be determined, monitored and/or tracked. For example, the stationary transmit/receive unit 181 may include a three-axis orientation sensor capable of measuring yaw, pitch and roll based on a magnetic field. Consequently, based on a known magnetic field induced by the mobile transmit/receive unit 160, the location, position and/or orientation of the mobile transmit/receive unit 160 can be determined. The distance and/or location of the stationary transmit/receive unit 181 with respect to the mannequin 170 can be known; consequently, the location, position and/or orientation of the mobile transmit/receive unit 160 with respect to the mannequin 170 can be determined. [0026] In some embodiments, components or systems that are commercially available can be used. Any system that provides or enables elements such as dynamic and/or real-time measurements of position (e.g., Cartesian X, Y and Z coordinates) and/or orientation (e.g., azimuth, elevation and roll) can be used. For example, products available from Polhemus® and/or NDI® can be used to track a medical instrument, finger or element used to perform a simulated procedure.
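As an illustration of the orientation measurement just described, the sketch below recovers azimuth (yaw), elevation (pitch) and roll angles from a 3x3 rotation matrix such as a tracking system might internally maintain. The Z-Y-X angle convention is an assumption made only for this example; commercial trackers document their own conventions.

```python
import math


def yaw_pitch_roll(r):
    """Extract Z-Y-X (yaw, pitch, roll) angles in radians from a
    3x3 rotation matrix given as nested lists."""
    # For R = Rz(yaw) @ Ry(pitch) @ Rx(roll):
    #   r[2][0] = -sin(pitch)
    #   r[1][0] = sin(yaw) * cos(pitch),  r[0][0] = cos(yaw) * cos(pitch)
    #   r[2][1] = cos(pitch) * sin(roll), r[2][2] = cos(pitch) * cos(roll)
    pitch = -math.asin(max(-1.0, min(1.0, r[2][0])))
    yaw = math.atan2(r[1][0], r[0][0])
    roll = math.atan2(r[2][1], r[2][2])
    return yaw, pitch, roll


# Identity rotation -> all angles are zero
print(yaw_pitch_roll([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))
# A 90-degree rotation about Z -> yaw of pi/2
print(yaw_pitch_roll([[0, -1, 0], [1, 0, 0], [0, 0, 1]]))
```

Note that near pitch = ±90 degrees this decomposition suffers from gimbal lock; trackers that report quaternions avoid that ambiguity.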
Using the tracking information provided by a tracking system and a location, position, orientation or other spatial parameters of a doll or mannequin (for example, mannequin 170), digital models of one or more of an instrument, finger or element and a digital model of a doll or mannequin can be manipulated (and displayed), in real time, in such a way that the digital models adequately and closely represent one or more of an instrument, finger, element or doll. Consequently, a user performing a simulated procedure (for example, inserting an element such as a support mesh into the mannequin 170) can be provided with a comprehensive view, from any desired angle and of any desired section, of the mannequin or doll used, as well as of other elements, for example, an instrument, finger or element. [0027] In some embodiments, several stationary transmit/receive units that may be similar to the transmit/receive unit 181 can be used. Each of such a plurality of stationary transmit/receive units can transmit a specific signal, for example, a signal having a specific, known frequency, amplitude or other characteristic. Consequently, several different signals can be received by the transmit/receive unit 160. The transmit/receive unit 160 can communicate any parameters related to the received signals to controller 131, which can calculate, based on the parameters received from the transmit/receive unit 160, the location of the transmit/receive unit 160. For example, by analyzing the strength of the signals received by the transmit/receive unit 160 from each of the plurality of stationary transmit/receive units, the distances of the transmit/receive unit 160 from the stationary transmit/receive units can be calculated. [0028] Using the calculated distances and the known locations, arrangement or placement of the stationary transmit/receive units, the location of transmit/receive unit 160 can be determined by controller 131.
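A hypothetical multilateration sketch of this distance-based determination follows: given four stationary units at known coordinates and the measured distances to each, the mobile unit's position can be solved from a linearized system. The station coordinates and the linear-algebra details are illustrative assumptions, not the patented method.

```python
def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]


def locate(stations, dists):
    """stations: four known (x, y, z) positions; dists: measured ranges.
    Subtracting the first range equation from the others linearizes
    |x - p_i|^2 = d_i^2 into 2(p_i - p_1) . x = d_1^2 - d_i^2 + |p_i|^2 - |p_1|^2."""
    (x1, y1, z1), d1 = stations[0], dists[0]
    n1 = x1 * x1 + y1 * y1 + z1 * z1
    a, b = [], []
    for (x, y, z), d in zip(stations[1:4], dists[1:4]):
        a.append([2 * (x - x1), 2 * (y - y1), 2 * (z - z1)])
        b.append(d1 * d1 - d * d + (x * x + y * y + z * z) - n1)
    return solve3(a, b)


stations = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2)]
target = (0.5, 1.0, 0.25)
dists = [sum((s - t) ** 2 for s, t in zip(st, target)) ** 0.5
         for st in stations]
print([round(c, 6) for c in locate(stations, dists)])  # prints: [0.5, 1.0, 0.25]
```

With noisy real-world range measurements, more than four stations and a least-squares solve would be used instead of this exact system.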
For example, using known coordinates of the stationary transmit/receive units and known attributes of the signals transmitted by the stationary units, the distances of the mobile transmit/receive unit 160 from the stationary transmit/receive units can be calculated. Based on a number of distances from a respective number of known coordinates, the location of the mobile transmit/receive unit 160 can be determined. The distances and/or locations of the stationary units with respect to the mannequin 170 can be known; consequently, the location of the mobile transmit/receive unit 160 with respect to the mannequin 170 can be determined. [0029] Embodiments of the invention may be applicable to various medical procedures. In particular, embodiments of the invention may be applicable to anterior and/or posterior repair procedures that can be performed to treat pelvic organ prolapse. Although the description presented here relates mainly to a pelvic floor reconstruction intervention procedure, it will be understood that embodiments of the invention may be applicable to several other procedures. For example, embodiments of the invention may be applicable to pregnancy examinations or to procedures related to the treatment or diagnosis of the cervix. Generally, pelvic organ prolapse (POP) occurs when a pelvic organ (for example, a bladder) drops (prolapses) from its normal location in the lower pelvis and presses against the walls of the vagina, for example, as a result of stretched and weakened muscles. Strain during childbirth, a lasting cough or tumors in a pelvic organ can be associated with high risk factors for POP. Known related conditions or pathologies may be, for example, apical vaginal prolapse (enterocele), anterior vaginal wall prolapse (cystocele) and posterior vaginal wall prolapse (rectocele).
[0030] A pelvic floor reconstruction intervention procedure (for example, an anterior or posterior repair procedure) may include the insertion of an element, for example, an element that can be used as a mechanical support or bridging material for a fascial defect. For example, one element may be the GYNECARE GYNEMESH® PS mesh that can be used to reinforce tissue and stabilize pelvic floor fascial structures in cases of vaginal wall prolapse. When performing a pelvic floor reconstruction intervention procedure, a doctor can insert a finger to develop or create a paravaginal and/or rectovaginal space. The procedure may also include palpation of the relevant reference points, insertion of a guide and discovery (by touch) of a suitable place for placing an element (for example, a support mesh). A cannula can then be inserted (from a different direction) in such a way that it reaches the place where the doctor's finger is. A guide can be inserted to guide the cannula into position. The guide can be removed, leaving the cannula in place. A mesh can then be connected to a retrieval wire and pulled into place by pulling the wires. [0031] Consequently, a pelvic floor reconstruction intervention procedure can be an example of a procedure in which deviation from a path or loss of a location can put a patient at risk, since sensitive organs and tissues may be damaged. Exemplary damages can be: a bladder perforation, a nerve injury, an injury to the rectum, hemorrhage as a result of a blood vessel perforation, etc. However, and as discussed above, current methods do not allow a physician to train adequately for such a procedure. As described herein, embodiments of the invention can enable training for such an intervention procedure by providing 3D models of an anatomy, a hand, a medical instrument and/or an element and additionally manipulating, in real time, the models according to the operations and/or movements of a trainee.
Indications, warnings, alerts and guidance can all be provided, in real time, during a simulated procedure. [0032] According to embodiments of the invention, a simulated procedure may comprise a graphic representation of an anatomical model (for example, a 3D digital model) that can be displayed on a screen together with additional information, such as 3D digital models of instruments, fingers and/or elements. In some embodiments, a 3D digital model of an anatomical structure or organ and of instruments, fingers and/or elements may exhibit real anatomical or physical qualities, peculiarities, characteristics, natures or aspects, for example, moving, bending, contracting, reacting to pressure or medication, bleeding, etc. A simulation of a medical procedure can comprise an image or graphic representation of an anatomical organ, for example, a model as described herein, which can be rotated or otherwise positioned, or can behave in a way that mimics a real anatomical system, for example, change or evolve over time, change shape in response to an operation or an interaction with a medical instrument or substance, bleed, or otherwise exhibit behavior of the anatomical organ and of related instruments, drugs or other aspects. For example, a grid, mesh, catheter, stent, cannula, guide, syringe, needle or other instruments, devices or elements can all be shown and additionally simulated by a 3D digital model. Consequently, a doctor may perform a simulation of a medical procedure as a pre-procedure for the actual surgery (for example, a surgical rehearsal or surgical simulation), as part of a planning procedure, as a training session or as a post-procedure. [0033] Reference is made to Figure 2, which shows an exemplary flow chart describing a method for simulating a medical procedure according to some embodiments of the invention.
It will be understood that some of the elements described in the flowchart can occur or be performed concurrently, at the same point in time, or simultaneously. In some embodiments, some of the elements shown in Figure 2 can be omitted; in other embodiments, the order of operations shown in Figure 2 can be changed. As shown by block 210, the method may include providing a physical model of an anatomical structure and providing a 3D digital model of the anatomical structure. For example, the mannequin 170 can be provided and a 3D digital model generated by the model generation unit 110. In other embodiments, a 3D digital model can be provided by the manufacturer of the mannequin 170. [0034] Reference is further made to Figure 3, which shows an exemplary screen capture including a simulated 3D digital model according to embodiments of the invention. As shown by 310, a simulated 3D digital model can include any portion, organ, anatomical structure or system, for example, bones, blood vessels, muscles and the like. A digital model can be used in several ways. For example, the digital model can be used to closely imitate a simulated procedure performed on a mannequin. In other embodiments, a simulated 3D digital model can be used in various training methods. For example, to teach or train a doctor to recognize the internal parts (for example, by touch alone), the person in training can be instructed to touch a specific organ, and a system can provide an indication as to whether the organ was actually touched. For example, a user can be instructed to touch a specific muscle on a mannequin, the user can touch the muscle on the mannequin (for example, on the mannequin 170), and an indication of the muscle actually touched can be provided as shown by 320. Similarly, a user can be trained to reach specific areas within a mannequin and can be provided with an indication of whether the area has been reached or not.
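The touch-indication logic described above can be sketched as follows. This is an illustrative sketch only, assuming the tracking arrangement reports fingertip coordinates, and that each target organ (the names and coordinates below are hypothetical) is approximated by a simple spherical region:

```python
import math

# Hypothetical data (not from the patent text): each target organ on the
# mannequin is approximated by a center point and a radius in the tracker's
# coordinate frame; a touch is indicated when the tracked fingertip falls
# inside one of these regions.
ORGAN_REGIONS = {
    "muscle_a": ((10.0, 4.0, 2.0), 1.5),   # (center_xyz in cm, radius in cm)
    "muscle_b": ((12.5, 6.0, 1.0), 1.0),
}

def touched_organ(fingertip, regions=ORGAN_REGIONS):
    """Return the name of the organ region the fingertip is inside, or None."""
    for name, (center, radius) in regions.items():
        if math.dist(fingertip, center) <= radius:
            return name
    return None
```

For example, `touched_organ((10.2, 4.1, 2.0))` would indicate "muscle_a" was touched, while a fingertip far from every region yields no indication.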
[0035] As shown by block 215, the method may include providing a physical medical instrument and providing a 3D digital model of the physical medical instrument. For example, a cannula can be placed on a guide, so that by tracking the guide (for example, based on a sequence of determined locations), the location of the cannula can be tracked or determined. A cannula or guide can be used in placing a mesh in a pelvic floor reconstruction intervention procedure and can be provided to, and used by, a person in training in a simulated procedure. For example, a physical medical instrument and the mannequin 170 can be used, by a person in training, to simulate a medical procedure. A 3D digital model of a medical instrument can be generated by the model generation unit 110, which can be provided with any required information. For example, the model generation unit 110 can be provided with the dimensions of a medical instrument and/or other aspects, for example, its elasticity, and can generate a 3D digital model of the tool. As shown by block 220, the method may include placing a transmitter next to a physical model of the anatomical structure and attaching a receiver to the physical medical instrument. For example, the stationary transmit/receive unit 181 can be placed close to the mannequin 170 and the mobile transmit/receive unit 160 can be attached to a medical instrument provided as described with respect to block 215. As described herein, the flow can include determining a location of the medical instrument (or of a doctor's finger or hand). In some embodiments, instead of, or in addition to, placing a stationary transmitter in a known location with respect to a mannequin and attaching a mobile receiver to a medical instrument, to a hand or to a finger, a stationary receiver can be used, for example, in conjunction with a mobile transmitter.
In such embodiments, the mobile transmitter can transmit one or more signals that can be picked up or received by one or more stationary receivers. [0036] The receivers can forward the signals received from the mobile transmitter to a controller, which can determine the location of the mobile transmitter based on the one or more signals received by the stationary receivers. For example, provided with information such as the location of the one or more receivers, and with any relevant attribute or parameter related to the signals transmitted by the mobile transmitter that can be attached to the medical instrument or finger (for example, signal distribution in space, amplitude, attenuation, frequency and the like), a controller (e.g., controller 131) can determine a location and/or orientation of the medical instrument, finger or any object to which a mobile transmitter is attached. In some embodiments, to determine an orientation, two or more transmitters or receivers can be attached to a medical instrument or finger. For example, by attaching two transmitters to a medical instrument (where each can transmit a different signal), the locations of the two transmitters can be determined. Using the locations of the two transmitters, and knowing how these two transmitters are attached to the medical instrument, finger or other object, the exact spatial orientation of the instrument, finger or other object can be determined. It will therefore be understood that any arrangement of any number of stationary and/or mobile transmitters and/or receivers can be used to enable a controller to determine a location and/or orientation of a medical instrument, a user's finger or any other object.
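The determination of orientation from two tracked transmitters can be sketched as follows. This is a minimal illustration, assuming (the text does not require this) that the two transmitters are mounted along the instrument's long axis, so that the vector between their tracked locations gives the instrument's spatial orientation:

```python
import math

def orientation_from_two_points(p1, p2):
    """Unit direction vector from transmitter 1 to transmitter 2.

    Sketch only: the two transmitters are assumed mounted along the
    instrument's long axis, so this vector is the instrument's orientation.
    """
    d = [b - a for a, b in zip(p1, p2)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0:
        raise ValueError("transmitter locations coincide; orientation undefined")
    return tuple(c / norm for c in d)
```

For example, transmitters tracked at (0, 0, 0) and (0, 0, 2) yield the orientation (0, 0, 1), i.e., the instrument points along the vertical axis.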
[0037] As shown by block 225, the method may include determining a location and/or orientation of the medical instrument based on a signal transmitted by the transmitter and received by the receiver, in which the location and/or orientation relate to a location, position, orientation or other aspect of the physical model. For example, the location of a medical instrument can be determined by determining, for example, as described herein, a location and/or orientation of a transmitter that can be attached to the medical instrument. By using one or more parameters related to the mannequin's location, the location of the medical instrument in relation to the mannequin can be computed and/or determined. The orientation of the mannequin can be known, for example, stored in the data repository 140 and/or loaded into memory 130; thus, for example, any parameter related to a location and/or orientation of the mannequin 170 can be available to the controller 131. Consequently, using the known location and/or orientation of a medical instrument and of a mannequin, 3D models of the medical instrument and the mannequin can be displayed, for example, on a screen connected to a computing device, where the models are displayed in such a way that the respective or relevant locations and orientations of the 3D models are in accordance with the locations and orientations of the physical medical instrument and the mannequin. [0038] Reference is also made to Figure 4, which shows an exemplary screen capture including simulated 3D digital models of an anatomical structure, an instrument and an element according to embodiments of the invention.
Upon determining a location and/or orientation of the medical instrument and/or of an element, in which the location and/or orientation relate to a location, position, orientation or other aspect of the physical model, embodiments of the invention may suitably present a model of the physical model (for example, the mannequin 170) and a model of a medical instrument and/or element (and/or of a finger or other object) in such a way that the respective locations and orientations of the digital models are the same as the locations and orientations of the real (physical) entities. For example, the locations, positions and orientations of the instrument 420 and the element 430 with respect to the model 410 may be substantially the same as the locations, positions and orientations of an actual instrument and/or physical element with respect to a physical model of a human individual (for example, the mannequin 170). [0039] As shown by block 230, the method may include performing, by a user using a physical medical instrument, a simulation of a medical procedure. For example, a simulation of a pelvic floor reconstruction intervention procedure can be performed by a user using the medical instrument and the mannequin 170. As shown by block 235, the method may include manipulating the 3D digital models of the anatomical structure and the medical instrument according to the location of the physical medical instrument. For example, the simulation unit 115 can manipulate the 3D digital models of the anatomical structure and the medical instrument according to a location and/or orientation of the medical instrument and according to any attributes of the 3D digital models. For example, a 3D digital model of an anatomical structure can be made to mimic the reaction of an anatomical structure to an intervention by the medical instrument (for example, bending, stretching, bleeding, etc.).
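The alignment of a tracked instrument location with the mannequin's known pose can be illustrated by a minimal coordinate transform. This is a sketch under simplifying assumptions: the mannequin's stored pose is taken to be a position plus a rotation about the vertical axis only (a full implementation would use a complete rotation matrix or quaternion):

```python
import math

def tracker_to_mannequin(point, mannequin_origin, yaw_deg):
    """Express a tracked point in the mannequin's own coordinate frame.

    Illustrative sketch: translate by the mannequin's known position,
    then rotate by the inverse of its known yaw angle.
    """
    t = math.radians(-yaw_deg)
    x = point[0] - mannequin_origin[0]
    y = point[1] - mannequin_origin[1]
    z = point[2] - mannequin_origin[2]
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t),
            z)
```

For example, an instrument tip tracked at (2, 1, 0) relative to a mannequin at (1, 1, 0) rotated 90 degrees is expressed as approximately (0, -1, 0) in the mannequin's frame, allowing the digital models to be drawn in matching relative poses.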
[0040] For example, a 3D model of a medical instrument can be moved, rotated or caused to change shape based on a location, position or orientation of the related physical instrument. A 3D model of a mannequin can be manipulated based on the location, position or orientation of a medical instrument. For example, a modeled tissue or organ included in a 3D digital model of a mannequin can be made to bend, stretch or otherwise change shape, location or orientation based on a medical instrument's position, location or orientation, for example, to simulate an interaction of the medical instrument with the mannequin. Consequently, 3D models of a medical instrument and a mannequin can be manipulated in such a way that they closely duplicate, imitate, replicate, repeat, copy or reproduce any movement or other aspect of the physical medical instrument and mannequin. The 3D model of the mannequin can be manipulated in such a way that it mimics or reproduces the response or interaction of a real individual, patient or physical mannequin to the medical instrument. [0041] As shown by block 240, the method may include attaching a transmitter to a user's finger, providing a 3D digital model of the finger and receiving a location and/or an orientation parameter related to a location and/or an orientation of the finger, where the location and/or orientation relate to a location, position, orientation or other aspect of the physical model. For example, one or more transmission units (for example, units similar to the mobile transmit/receive unit 160) can be attached to the doctor's finger or hand, and a 3D digital model of the hand or finger can be displayed on a computer screen and can additionally be manipulated according to a movement of the finger. For example, the location and/or orientation of the finger can be determined based on the location and/or orientation of the attached transmitter, as described herein.
Reference is also made to Figure 5, which shows an exemplary screen capture including simulated 3D digital models of an anatomical structure and a finger. Using a location parameter related to a location, orientation or position of a finger with respect to a location of a physical model, a simulated digital model of a finger, as shown by 520, can be displayed with respect to a simulated digital model of an anatomical system, as shown by 510, where the two models are displayed in such a way that their respective locations, positions or orientations are substantially the same as those of the related real finger of a user and of the real physical model. [0042] As shown by 530, a screen can be divided into a number of regions. For example, the model 510 and the finger 520 can be displayed in a first region and an anatomical tree 530 can be displayed in a second region. In one embodiment, an anatomical tree can include checkboxes that can be used to include selected elements or attributes in a digital model or to exclude those elements or attributes. For example, a user can choose to show only a specific muscle (for example, while hiding or obscuring any other anatomical parts or information via a digital model of skin tissue). Consequently, a training level can be established; for example, more and more anatomical regions or organs can be omitted from a digital model as expertise increases and/or to better approximate real-life experience. In some embodiments, an anatomical tree can include families or groups of selectable elements. For example, elements such as bone structures and blood vessels can be selected based on a selection or indication of a simulated patient's age, gender or other aspects. For example, a first set can be selected automatically for a young man and a second set can be selected for an elderly woman. Pathologies can also be selected, for example, low blood pressure, pathologies related to muscles, etc.
In some embodiments, physical parts modeling a pathology, a gender, an age or other aspects of a simulated patient can be installed on a mannequin. A 3D digital model can be automatically synchronized with a set of physical parts modeling an aspect of a patient, for example, the set of removable or replaceable elements included or installed on the mannequin 170. For example, the mobile transmit/receive unit 160 can receive a signal or information (for example, an identification code) from a passive or active component installed on a replaceable physical part installed on the mannequin 170, and can transmit the identification code to the controller 131 (for example, to the management unit 135), which can instruct the model generation unit 110 to generate a model according to specific parameters selected based on the identification code. Similarly, different mannequins (for example, a set modeling a child, an adult and a baby) can be automatically identified and a pre-configured 3D digital model can be selected based on the identification. [0043] Reference is also made to Figure 6, which shows an exemplary screen capture including simulated 3D digital models of an anatomical structure and a medical instrument. Using a location parameter related to a location, orientation or position of a medical instrument with respect to a location of a physical model, a simulated digital model of a medical instrument, as shown by 620, can be displayed with respect to a simulated digital model, as shown by 610, where the two models are displayed in such a way that their respective locations, positions or orientations are substantially the same as those of the related real medical instrument and the real physical model. It will be understood that models such as those shown in Figures 3, 4, 5 and 6 can be manipulated.
For example, the models can be manipulated in such a way that they dynamically and in real time closely copy, or are otherwise related to, the positions, locations or other spatial aspects of the real, physical entities, for example, a mannequin, a medical instrument and/or the finger or palm of a person in training. The anatomical tree shown by 630 may be similar to the anatomical tree shown by 530. [0044] As shown by block 245, the method may include manipulating the 3D digital models of the anatomical structure and the finger according to the location and/or orientation of the finger. For example, any movement, position or change of location of the doctor's finger or hand can be closely duplicated, imitated, replicated, repeated, copied or reproduced by the 3D digital model of the hand or finger. The 3D model of the mannequin can be manipulated according to any movement, position or change of location of the doctor's finger or hand. For example, the 3D model of the mannequin can be manipulated in such a way that it mimics or reproduces the response or interaction of a real individual, patient or physical mannequin to a doctor's finger or hand. [0045] As shown by block 250, the method may include attaching a transmitter to an element, providing a 3D digital model of the element and receiving a location and/or an orientation parameter related to the element's location and/or orientation, the parameter being related to a location and/or orientation of the physical model. For example, a transmitter can be attached to a physical support grid or mesh inserted and/or placed during a pelvic floor reconstruction intervention procedure. Using the techniques described herein, or any other technique, the location of the mesh can be determined or known. Consequently, a 3D model of the element can be manipulated to reflect the physical location, position or any other geometric aspect of the mesh.
In some embodiments, several transmitters can be attached to an element, for example, to better determine the position of the element. For example, transmitters can be attached at a number of locations on an element, for example, at the ends and center of a mesh. Consequently, the exact location, position and layout of the mesh can be determined and can be presented by a corresponding 3D model of the mesh. [0046] As shown by block 255, the method may include providing a user with feedback related to the performance of the medical procedure based on a location and/or orientation of at least one of: the finger, the physical medical instrument and the physical element. For example, the display unit 125 can display any feedback. For example, an ideal path can be pre-configured and/or provided to the controller 131. For example, the coordinates of an ideal path for insertion of a cannula (for example, as determined by an expert) can be determined and stored, for example, in the data repository 140. Using a sequence of tool, finger and/or element locations, the controller 131 can compare the path of an instrument, a finger and/or an element with the pre-configured path. The controller 131 can then guide a user. For example, by comparing the route taken by a user inserting a cannula with a preferred route, the controller 131 can determine that the user is deviating from the preferred route or is moving away from a preferred location, and can instruct or assist the user to return to, or keep to, the preferred route, or to reach the preferred location. For example, at any point during the simulated procedure, the controller 131 can provide guidance to a user, for example, "guide the cannula to the left", "move your fingers up", "move the guide forward by 3 centimeters", etc. Consequently, embodiments of the invention may make it possible to carry out an image-guided computer simulation of a procedure.
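The route comparison and guidance described above can be sketched as follows. The waypoints, tolerance and hint wording are hypothetical illustrations; a real system would compare against densely sampled expert paths rather than three points:

```python
import math

# Hypothetical expert path: a polyline of recorded waypoints (cm).
IDEAL_PATH = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (2.0, 1.0, 0.0)]

def deviation(tip, path=IDEAL_PATH):
    """Distance from the instrument tip to the closest waypoint, and that waypoint."""
    best = min(path, key=lambda w: math.dist(tip, w))
    return math.dist(tip, best), best

def guidance(tip, path=IDEAL_PATH, tolerance=0.25):
    """A coarse left/right hint, in the spirit of 'guide the cannula to the left'."""
    dist, target = deviation(tip, path)
    if dist <= tolerance:
        return "on track"
    return ("guide the instrument to the left" if tip[0] > target[0]
            else "guide the instrument to the right")
```

For example, a tip tracked at (2.5, 1.0, 0.0) lies 0.5 cm right of the last waypoint, so the sketch emits "guide the instrument to the left".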
[0047] According to embodiments of the invention, the provision of feedback can be performed simultaneously or concurrently with the performance of the related simulation of a procedure, or can otherwise coincide with it in time. In some embodiments, the provision of feedback can be synchronized or otherwise coordinated with a progress, state, mode, context or any relevant aspect of a simulated procedure. Feedback can be provided in conjunction with the simulated procedure, for example, while the simulated procedure is in progress. For example, a single screen can be used to display feedback and one or more 3D models, for example, 3D models of a tool, finger and/or element. In some embodiments, a location of a tool, finger or element can be used to control force feedback or other devices. For example, based on a location of a tool (for example, whether the tool is in contact with tissue or bone), a force feedback control signal can be generated in such a way that a physical tool used by a person in training provides the trainee with an appropriate sensation or feedback. [0048] As shown by block 260, the method may include determining a score related to the performance of the medical procedure and recording the score. For example, by comparing a route traced by a cannula or catheter inserted by a user performing the simulated procedure with a preferred or predefined path, a score can be calculated. For example, the smaller the deviation from an ideal or preferred route, the higher the score. A time for completing a predefined part of an operation can be pre-configured and the time it takes a user to complete such a part can be recorded (for example, the controller 131 can start a timer when an element is located at a first location and stop the timer when the element reaches a second location). Consequently, a score can be calculated based on the time the user needed to complete the task.
Any other aspect of a simulated procedure can be used to calculate or compute a score. For example, a preferred location or placement of an element can be calculated, or can be indicated or determined by an expert, and the score can take into account information about adverse events such as perforation of a vessel, hitting nerves, and the amount of blood lost during the operation. By determining a location or placement of an element (for example, using signals transmitted by one or more transmitters attached to the element and received by a stationary receiver) and comparing the location and placement with the preferred location and placement, a score can be calculated and recorded. [0049] A score can be based on any parameter related to the performance of a procedure. For example, a score can be related to, or based on, the route used to reach a location within an anatomical structure, the time it took to perform a task, the placement of an element, and the like. For example, a performance of a procedure by an expert can be recorded, and any aspect of a performance of the procedure by a person in training can be compared or otherwise related to the recorded procedure as performed by the expert. Additionally or alternatively, thresholds (for example, related to time, location, etc.) can be configured and the performance of a procedure can be scored by comparing parameters (for example, a duration) with a threshold. Other parameters or aspects of a score can be, for example, entering specific areas or missing specific areas. For example, a specific anatomical area, region or organ can be marked as "prohibited", and the presence of the user's medical instrument, element, hand or finger in such an area or region can be flagged and/or affect a score. Similarly, a position or orientation of a tool, element, hand or finger, for example, with respect to a mannequin, that deviates by more than a predefined threshold or other criterion can affect a score.
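One possible combination of the scoring aspects above can be sketched as follows. The weights, time limit and penalties are illustrative assumptions only; the text does not fix a particular formula:

```python
def score_procedure(max_deviation_cm, elapsed_s, prohibited_entries,
                    time_limit_s=120.0):
    """Combine path deviation, elapsed time and prohibited-area entries
    into a single 0-100 score.

    Illustrative weighting (assumed, not specified by the text):
    each cm of worst deviation costs 20 points, each second over the
    time limit costs 0.5 points, each entry into a 'prohibited' region
    costs 15 points; the score never drops below zero.
    """
    score = 100.0
    score -= 20.0 * max_deviation_cm
    if elapsed_s > time_limit_s:
        score -= 0.5 * (elapsed_s - time_limit_s)
    score -= 15.0 * prohibited_entries
    return max(0.0, round(score, 1))
```

Under these assumed weights, a trainee with 1 cm of worst deviation, a completion time within the limit and one entry into a prohibited region would score 65.0, while a flawless run scores 100.0.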
[0050] Additional reference is made to Figure 7, which shows an exemplary screen capture including an exemplary presentation of a report 710 including scores according to embodiments of the invention. A score can be presented to a user and can be saved. Consequently, a user's progress can be monitored and/or evaluated, for example, by observing a number of scores calculated during different performances of a procedure. As shown, a report can include scores for anatomical finger recognition and for the use of a guide. For example, and as shown, scores related to anatomical finger recognition can cover the percentage of the left (and right) portion of the rectovaginal space, and scores related to the use of a guide can be the left (and right) posterior anomalies and the maximum anomaly from the ideal posterior left (and right) route. Other scores can be based on, or be related to, for example, a percentage of organs related to the anterior procedure that have been correctly palpated, a percentage of coverage of the arcus tendineus fascia pelvis (ATFP), a percentage of palpation coverage of the total paravesical space, a number of times a guide has deviated from an ideal (for example, predetermined or pre-configured) route, or a distance between two or more placed or inserted elements (for example, cannulas). [0051] The feedback provided may comprise alerts, alarms or any other indication or information. For example, an alarm can be triggered if a user guides a cannula too close to a blood vessel. For example, a 3D model of the mannequin 170 can include any information or data, including the locations of blood vessels. Consequently, provided with any relevant anatomical information and any parameter related to the location of an instrument, finger or element, the controller 131 can determine that an alarm should be triggered.
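The proximity alarm can be sketched as a distance test between the tracked instrument tip and a blood vessel modeled as a polyline of segments; the safety margin below is an assumed value:

```python
import math

def dist_point_segment(p, a, b):
    """Shortest distance from point p to the segment a-b (one vessel section)."""
    ab = [bi - ai for ai, bi in zip(a, b)]
    ap = [pi - ai for ai, pi in zip(a, p)]
    denom = sum(c * c for c in ab)
    # Clamp the projection parameter so the closest point stays on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    closest = [ai + t * c for ai, c in zip(a, ab)]
    return math.dist(p, closest)

def vessel_alarm(tip, vessel_polyline, safety_margin=0.5):
    """True if the instrument tip is within the safety margin (cm, assumed)
    of any segment of the modeled blood vessel."""
    return any(
        dist_point_segment(tip, a, b) < safety_margin
        for a, b in zip(vessel_polyline, vessel_polyline[1:])
    )
```

For example, with a vessel running from (0, 0, 0) to (0, 0, 10), a tip tracked 0.3 cm away would trigger the alarm, while a tip 2 cm away would not.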
[0052] Although certain features of the invention have been illustrated and described herein, many modifications, substitutions, alterations and equivalents may occur to those skilled in the art. Therefore, it should be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
Claims (17) [0001] 1. Method for simulating a medical procedure, the method characterized by comprising: installing a physical model of an anatomical structure on a physical mannequin; automatically synchronizing a digital three-dimensional model of: the physical mannequin, a part of the physical mannequin, or the physical model of the anatomical structure, with the physical model of the anatomical structure installed on the physical mannequin, based on a signal received from a component installed in the physical model of the anatomical structure, the synchronization comprising generating the digital three-dimensional model based at least on the received signal; displaying a digital three-dimensional model of a physical medical instrument or a digital three-dimensional model of a finger together with the synchronized digital three-dimensional model of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, on a screen connected to a computing device; placing a stationary transceiver at a known distance from: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and in a known orientation with respect to: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure; attaching a mobile transceiver to a physical medical instrument or a finger; analyzing signals received at one of the stationary transceiver and the mobile transceiver from the other of the stationary transceiver and the mobile transceiver; determining a location of the physical medical instrument or finger in relation to a location of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, based on the analyzed signals and on the known distance and orientation of the stationary transceiver with respect to: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical
structure; manipulating the synchronized digital three-dimensional model of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the digital three-dimensional model of the physical medical instrument or the digital three-dimensional model of the finger, according to the determined location of the physical medical instrument or finger; and after the manipulation, updating the display of the synchronized digital three-dimensional model based on the manipulation. [0002] 2. Method, according to claim 1, characterized by the fact that it comprises: dynamically updating, in real time, the display of the digital three-dimensional model of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the digital three-dimensional model of the medical instrument or the digital three-dimensional model of the finger, to reflect the location and position of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the medical instrument or the finger. [0003] 3. Method, according to claim 1, characterized by the fact that it comprises: determining, based on the analyzed signals, an orientation of the physical medical instrument or the finger; and displaying the synchronized digital three-dimensional model of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the digital three-dimensional model of the medical instrument or the digital three-dimensional model of the finger, according to the orientation of the physical medical instrument or the finger. [0004] 4. Method, according to claim 1, characterized by the fact that the simulation of the medical procedure additionally comprises placing an element inside: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure. [0005] 5.
Method, according to claim 4, characterized by the fact that the medical procedure is a pelvic floor reconstruction intervention procedure, and in which the element is a mechanical support or bridging material. [0006] 6. Method, according to claim 1, characterized by the fact that the simulation of the medical procedure comprises providing a user with feedback related to the performance of the medical procedure. [0007] 7. Method, according to claim 1, characterized by the fact that it comprises storing, in a database, a record of the simulated medical procedure. [0008] 8. Method, according to claim 4, characterized by the fact that it comprises: displaying a digital three-dimensional model of the element on the screen; analyzing signals received from the stationary transceiver placed at a distance from: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and from a transceiver attached to the element; determining a location of the element based on the analyzed signals; and manipulating the digital three-dimensional model of the element according to the location of the element. [0009] 9. Method, according to claim 1, characterized by the fact that it comprises determining a score related to the performance of the medical procedure based at least on the determined location of the physical medical instrument or finger. [0010] 10.
Article comprising a non-transitory computer storage medium having stored thereon instructions that, when executed by a processor on a processing platform, are configured to receive signals received at one of: a stationary transceiver placed at a known distance from: a physical mannequin, a part of the physical mannequin, or a physical model of an anatomical structure installed on the physical mannequin, and in a known orientation with respect to: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure; and a mobile transceiver attached to a physical medical instrument or a finger, the signals having been transmitted by the other of the stationary transceiver and the mobile transceiver, characterized by the fact that the instructions, when executed by the processor, result in: automatically synchronizing a digital three-dimensional model of: the physical mannequin, a part of the physical mannequin, or the physical model of the anatomical structure installed on the physical mannequin, based on a signal received from a component installed on the physical model of the anatomical structure, the synchronization comprising generating the digital three-dimensional model based at least on the received signal; displaying a digital three-dimensional model of a physical medical instrument or a finger together with the synchronized digital three-dimensional model of: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, on a screen connected to a computing device; analyzing the signals received at one of the stationary transceiver disposed at a distance from: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the mobile transceiver attached to a physical medical instrument or a finger, from the other of the stationary transceiver disposed at a distance from: the physical mannequin, from the part of the physical mannequin, or
from the physical model of the anatomical structure, and the mobile transceiver fixed to the physical medical instrument or finger; determine a location of the physical medical instrument or finger in relation to a location: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, based on the analyzed signals and the known distance and orientation of the stationary transceiver with respect to : the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure; manipulate the synchronized digital three-dimensional model: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure and the digital three-dimensional model of the physical medical instrument or the digital three-dimensional model of the finger, according to the determined location of the medical instrument physical or finger; and after manipulation, update the screen of the synchronized digital three-dimensional model based on the manipulation. [0011] 11. Article, according to claim 10, characterized by the fact that the instructions, when executed, additionally result in: dynamically updating, in real time, the display of the digital three-dimensional model: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure and the digital three-dimensional model of the medical instrument or the digital three-dimensional model of the finger, to reflect the location and position: of the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the instrument doctor or finger. [0012] 12. 
Article, according to claim 10, characterized by the fact that the instructions when executed additionally result in: determining, based on the analyzed signals, an orientation of the medical instrument or the finger; and display the synchronized digital three-dimensional model: of the physical mannequin, on the part of the physical mannequin, or of the physical model of the anatomical structure; and the digital three-dimensional model of the medical instrument or the digital three-dimensional model of the finger, according to the orientation of the physical medical instrument or the finger. [0013] 13. Article, according to claim 10, characterized by the fact that the instructions when executed additionally result in: analyzing signals received from at least one transceiver attached to an element, in which the signals are indicative of the element's placement : on the physical mannequin, on the part of the physical mannequin, or on the physical model of the anatomical structure; display a digital three-dimensional model of the element on the screen; determine a location of the element: on the physical mannequin, on the part of the physical mannequin, or on the physical model of the anatomical structure, based on the analyzed signals; and manipulate the digital three-dimensional model of the element according to the location of the element. [0014] 14. Article, according to claim 10, characterized by the fact that the instructions when executed additionally result in the provision to a user with feedback related to the performance of the medical procedure. [0015] 15. 
Method to simulate a medical procedure, the method characterized by understanding: installing a physical model of an anatomical structure on a physical mannequin; automatically synchronize a digital three-dimensional model: of the physical mannequin, of a part of the physical mannequin, or of the physical model of the anatomical structure, based on a signal received from a component installed in a physical model of the anatomical structure; display a digital three-dimensional model of a finger or a user or a digital three-dimensional model of a physical medical instrument with the synchronized digital three-dimensional model: of the physical manikin, part of the physical manikin, or the physical model of the anatomical structure, in a screen connected to a computing device; analyze one or both of: signals received on a stationary transceiver spaced at a known distance: from the physical manikin, from the part of the physical manikin, or from the physical model of the anatomical structure, and in a known orientation with respect to: the physical manikin, the part the physical mannequin, or the physical model of the anatomical structure, from a mobile transceiver associated with the user's finger or the physical medical instrument; and signals received on a mobile transceiver associated with the user's finger or the physical medical instrument of a transceiver spaced at a known distance: from the physical manikin, the part of the physical manikin, or the physical model of the anatomical structure, and in a known orientation with relation: to the physical mannequin, to the part of the physical mannequin, or to the physical model of the anatomical structure; determine a location of a user's finger or physical medical instrument in relation to a location: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, based on one or both analyzed signals and the known distance and orientation of the stationary transceiver 
with respect to: the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure; manipulate the synchronized digital three-dimensional model: of the physical mannequin, part of the physical mannequin, or the physical model of the anatomical structure, and the digital three-dimensional model of the user's finger or the digital three-dimensional model of the physical medical instrument according to the determined location of the finger of the user or physical medical instrument; and after manipulation, update the screen of the synchronized digital three-dimensional model based on the manipulation. [0016] 16. Method, according to claim 15, characterized by the fact that it comprises: dynamically updating, in real time, the display: of the synchronized digital three-dimensional model of the physical mannequin, of the part of the physical mannequin, or of the anatomical structure and the three-dimensional model the user's finger or the digital three-dimensional model of the physical medical instrument to reflect the location and position: of the physical mannequin, the part of the physical mannequin, or the physical model of the anatomical structure, and the user's finger or the digital three-dimensional model of the physical medical instrument. [0017] 17. Method, according to claim 15, characterized by the fact that the medical procedure is a pelvic examination.
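Claims 10 and 15 describe the same core loop: signals exchanged between a stationary transceiver (at a known distance and orientation from the mannequin) and a mobile transceiver on the instrument or finger are analyzed to localize the instrument, and the synchronized 3D model is then manipulated and redisplayed. The sketch below illustrates that localization-and-update step under simplifying assumptions; the names (`Pose2D`, `locate_instrument`, `update_model`) and the 2D range/bearing measurement model are hypothetical and not specified by the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Known pose of the stationary transceiver in the mannequin's frame.
    (Assumed 2D for brevity; the patent's models are three-dimensional.)"""
    x: float
    y: float
    theta: float  # orientation in radians

def locate_instrument(stationary: Pose2D, range_m: float, bearing_rad: float):
    """Convert a range/bearing measurement taken at the stationary
    transceiver into instrument coordinates in the mannequin frame,
    using the transceiver's known distance and orientation."""
    world_angle = stationary.theta + bearing_rad
    return (stationary.x + range_m * math.cos(world_angle),
            stationary.y + range_m * math.sin(world_angle))

def update_model(model_pose: dict, instrument_xy):
    """Manipulate the digital 3D model: record the determined instrument
    location so the display can be refreshed after the manipulation."""
    model_pose["instrument"] = instrument_xy
    return model_pose

# Stationary transceiver 0.5 m from the mannequin origin, rotated 90 degrees;
# the mobile transceiver is measured 0.2 m away along the transceiver's axis.
station = Pose2D(0.5, 0.0, math.pi / 2)
pos = locate_instrument(station, range_m=0.2, bearing_rad=0.0)
model = update_model({"mannequin": (0.0, 0.0)}, pos)
```

In a full simulator this loop would run continuously, re-deriving the instrument pose from each signal batch and re-rendering the synchronized model, which is what the "dynamically updating, in real time" limitation of claims 11 and 16 adds.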
Similar technologies:
Publication number | Publication date | Patent title
BR112013028273B1 | 2020-12-15 | METHOD FOR SIMULATING A MEDICAL PROCEDURE AND ARTICLE
US9142145B2 | 2015-09-22 | Medical training systems and methods
KR20180058656A | 2018-06-01 | Reality - Enhanced morphological method
KR102105980B1 | 2020-05-04 | Surgical training model for laparoscopic procedures
Basdogan et al. | 2007 | VR-based simulators for training in minimally invasive surgery
Rhienmora et al. | 2011 | Intelligent dental training simulator with objective skill assessment and feedback
US10453360B2 | 2019-10-22 | Ultrasound simulation methods
US20170140671A1 | 2017-05-18 | Surgery simulation system and method
US20140011173A1 | 2014-01-09 | Training, skill assessment and monitoring users in ultrasound guided procedures
US20150004584A1 | 2015-01-01 | Modular patient simulating mannequin and method thereof
WO2012160999A1 | 2012-11-29 | Stethoscopy training system and simulated stethoscope
US20210233429A1 | 2021-07-29 | Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation
US20170372640A1 | 2017-12-28 | Simulation features combining mixed reality and modular tracking
US20140180416A1 | 2014-06-26 | System, method and apparatus for simulating insertive procedures of the spinal region
US10828107B2 | 2020-11-10 | Mixed reality training system
Tai et al. | 2018 | A high-immersive medical training platform using direct intraoperative data
JPWO2017126313A1 | 2018-11-22 | Surgical training and simulation system using biological texture organs
Riener et al. | 2012 | VR for medical training
US20210327305A1 | 2021-10-21 | System for validating and training invasive interventions
WO2020210972A1 | 2020-10-22 | Wearable image display device for surgery and surgical information real-time presentation system
Sung et al. | 2020 | Intelligent Haptic Virtual Simulation for Suture Surgery
Jardin et al. | 2008 | A medical simulator for subcutaneous contraceptive implant insertion
JP2021152636A | 2021-09-30 | Medical information management server, surgery training apparatus, surgery training system, image transmission method, display method, and program
EP3392862A1 | 2018-10-24 | Medical simulations
Mendelsohn et al. | 2020 | Transoral Robotic Surgical Proficiency Via Real-Time Tactile Collision Awareness System
Family patents:
Publication number | Publication date
MX2013012830A | 2014-11-26
BR112013028273A2 | 2017-09-19
WO2012150596A1 | 2012-11-08
US10354555B2 | 2019-07-16
US20120282583A1 | 2012-11-08
CO6862112A2 | 2014-02-10
EA201291123A1 | 2013-07-30
MX364876B | 2019-05-09
Legal status:
2020-03-10 | B06U | Preliminary requirement: requests with searches performed by other patent offices: suspension of the patent application procedure
2020-09-15 | B09A | Decision: intention to grant
2020-12-15 | B16A | Patent or certificate of addition of invention granted | Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 30/04/2012, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Application date | Patent title
US13/098,819 (US10354555B2) | 2011-05-02 | System and method for performing a hybrid simulation of a medical procedure
PCT/IL2012/050151 (WO2012150596A1) | 2012-04-30 | System and method for performing a hybrid simulation of a medical procedure