专利摘要:
ROBOTIZED SURGICAL SYSTEM WITH IMPROVED CONTROL Robotic surgical system (10) comprising at least one robotic arm (11) that acts under the control of a control console (12) for the surgeon. The console (12) comprises an eye tracking system (21) for detecting the direction of the surgeon's gaze and for entering commands according to the direction of the detected gaze. The console (12) advantageously comprises a screen (22) with at least one zone (23) for viewing the operating field and, among the commands that can be carried out depending on the gaze direction, there is advantageously an automatic command for enabling or disabling the movement of a robotic arm (11) when the detected direction of gaze falls within or outside said zone (23) of the screen.
公开号:BR112012025516B1
申请号:R112012025516-9
申请日:2011-04-01
公开日:2020-11-17
发明作者:Emilio Ruiz Morales;Damien Brasset;Paolo Invernizzi
申请人:Transenterix Italia, S.R.L;
IPC主号:
专利说明:

The present invention relates to a robotic surgical system, specifically for minimally invasive operations, such as endoscopic procedures.
In the prior art, robotic systems have been proposed to perform surgical operations, specifically laparoscopic operations. These robotic systems comprise one or more robotic arms that are controlled by a surgeon using a special console. The console generally comprises a video screen, on which the images of the operative field (usually recorded by means of an endoscopic telecamera) are shown, in addition to suitable manipulators, through which the surgeon controls the movements of the robotic arms. The manipulators can be of the "reactive" type, so that the surgeon can also feel in his hands the forces exerted by the robotic surgical instruments on the patient's tissues.
Normally, one of the robotic arms moves the telecamera, so that the surgeon is able to vary his viewing angle during the operation, while one or more robotic arms move the operating instruments (tweezers, scissors, scalpels, aspirators, etc.) with which the surgeon performs the intended operation.
Control of the system nevertheless remains complicated for the surgeon, owing to the number of available controls and the various operating parameters that must be supervised. In addition to the movement of the robotic arms, the surgeon must also operate the surgical instruments mounted on the arms (for example, control the opening and closing of the clamps or scissors) and control the lens of the telecamera, in order to have closer or more distant images of the operative field, as needed, at any given time.
Often, while the surgeon looks at the image shown by the telecamera, he must simultaneously operate the two robotic arms with the surgical instruments, so that both hands are busy controlling the corresponding manipulators.
This prevents the surgeon from being able to use other functions of the system, unless he removes one of his hands from the controls of the surgical instruments.
Systems have also been proposed in which certain functions can be activated using pedals, or using additional control devices provided on the manipulators, so that they can be operated with a finger without the manipulator actually having to be released. All of these systems, however, are often not intuitive and, in any case, distract the surgeon's attention when he is performing a precise maneuver with the surgical manipulators.
An additional problem is that the surgeon's attention, for one reason or another (for example, in order to perform the additional functions or to view the operating parameters shown on the console or on other instruments outside the operative field), may be diverted from the screen of the operative field. Under these conditions, the surgeon may not detect incorrect or dangerous movements of the robotic surgical instruments, which may also be involuntarily performed by the surgeon while holding the manipulators.
In order to avoid such situations, it has been proposed to use a pedal that the surgeon must operate in order to activate/deactivate the movement of the robots, so as to be able to deactivate the movement before removing his hands from the manipulators and directing his attention away from the operative field. Timely operation of this pedal, however, depends entirely on the surgeon.
The main objective of the present invention is to provide a robotic surgical system and a control method that allow better and safer control by the surgeon, thus ensuring, among other things, improved operational safety of the system.
In view of this objective, the idea proposed in this invention is to provide a robotic surgical system comprising at least one robotic arm that acts under the control of a control console intended for the surgeon, characterized in that the console comprises an eye tracking system to detect the direction of the surgeon's gaze and to trigger commands depending on the direction of the detected gaze.
An additional idea according to the invention is to provide a method for controlling a robotic surgical system comprising at least one robotic arm that acts under the control of a control console intended for the surgeon, in which the direction of the surgeon's gaze is detected and functions of the surgical system are controlled depending on the direction of the gaze.
In order to more clearly illustrate the innovative principles of the present invention and their advantages compared to the prior art, an example of the application of these principles will be described below, with the aid of the attached drawings.
In the drawings: - Figure 1 shows a schematic view of a surgical system according to the invention; - Figure 2 shows the block diagram of the system according to the invention.
With reference to the figures, figure 1 shows, schematically, a robotic surgical system, usually indicated by 10, supplied according to the invention.
The system 10 comprises at least one robotic arm 11 which acts under the control of a control console 12 operated by the surgeon, who is, for example, in a comfortable working position. The console can also be mounted on wheels so that it can be easily moved.
The robotic arm can be of a substantially known type suitable for this specific use. In the system 10 shown in the present document the robotic arms are three in number, although a different number can easily be used.
The robotic arm (or each robotic arm) ends in a gripper that is capable of supporting and operating a surgical instrument for use on a patient 13. This instrument will generally be a known instrument for endoscopic and, specifically, laparoscopic operations. One of the instruments is advantageously a telecamera 14 that records the operative field (in this specific case, the patient's interior), whereas the other instruments can be suitable known surgical instruments 15, 16 (tweezers, aspirators, scalpels, etc.). Robotic arms, instruments and actuators for maneuvering the instruments will not be described and illustrated further in this document, as they are known per se and can be easily imagined by one skilled in the art. The surgical operations possible with this system, as well as the methods for their execution, will likewise not be described further, since they can be easily imagined by a person skilled in the art.
The robotic arms are operated by a suitable and known electronic control unit 30, in order to execute the movements entered through the console 12. The unit 30 receives the high-level movement commands (for example, the desired position and inclination of the instrument supported by the robot) and executes them, converting them into the corresponding signal sequences to be sent to the individual motors of the articulations of the robotic arm. The robotic arms can also be provided with known force sensors, used both by the unit 30 to prevent damage due to collision of the arm with objects in the workspace, and in order to provide suitable feedback to the console operator, as will be explained below. The connections between the console and the robot control unit can advantageously be of the optical fiber type, in order to reduce the possibility of interference with the transmitted signals.
Suitable robotic arms and control systems are described, for example, in WO2007/088208, WO2008/049898 and WO2007/088206.
In order to perform the movements of the surgical instruments, the console can advantageously comprise a pair of known manipulators 17, 18 that can be held by the surgeon, their movements being suitably reproduced by the surgical instruments through the movement of the robotic arms 11. Advantageously, the manipulators can be of the known "reactive" type (that is, with a tactile interface that provides weighted movement and tactile sensors), in such a way that the surgeon is also able to sense, in the manipulators, the forces exerted by the robotized surgical instruments on the patient's tissues. Suitable tactile interfaces are well known in the field of robotic endoscopic surgery.
Normally, each manipulator will operate one robotic arm. Advantageously, in the case of more than two arms, a control will be provided on the console in order to be able to assign, as necessary, each manipulator to a desired robotic arm, as will be explained later. A keyboard 19 and other command input devices, for example a pedal device 20, can also be provided. Device 20 may comprise one or more pedals for activating, for example, the power supply for monopolar and bipolar instruments, the irrigation and suction functions, if provided for a specific instrument, etc.
Console 12 also comprises an eye movement tracking system 21, or so-called "eye tracking" system, for detecting the direction of the surgeon's gaze towards the console and for controlling the surgical system depending on the detected gaze direction. In this way, the surgeon can control functions of the system through eye movement.
Advantageously, the console includes a video screen 22, with at least one zone 23 to show a view of the operating field. This view is provided by the telecamera 14 and can be supplemented (for example, in order to provide the surgeon with additional information and/or improve his understanding of the operative field) with artificial images generated by a computerized system 24, known per se, for console management, which will be further described below.
As will become clear below, the computerized system 24 generates and controls a human-machine interface (HMI) that allows the surgeon to interact with the surgical system.
For conventional two-dimensional viewing, the image provided by the endoscope can be viewed directly on the screen 22.
Advantageously, however, a three-dimensional system for visualizing the operating field can be provided. In that case, the telecamera 14 may be of a known stereoscopic type, which provides suitable signals 25, 26 representing two different "right" and "left" images that are spatially offset from each other.
The signals 25, 26 are processed by an electronic device 27, so that the three-dimensional image can be shown to the surgeon through a stereoscopic visualization system.
Among the several known stereoscopic visualization systems, a polarized filter system has been found to be specifically advantageous; in this system, the electronic device 27 includes a known stereo mixer that alternates lines of the right and left images received from the telecamera, in order to show them interleaved in the viewing area 23 of the screen.
Odd and even horizontal lines alternating in the screen image thus alternately represent lines of the right and left images recorded by the telecamera.
A known filter provided with two different polarization modes for the even interleaved lines and the odd interleaved lines is provided in area 23 for the interleaved image display. In order to view the image, the surgeon wears glasses 28 with two polarized lenses corresponding to the two polarization modes of the filter on the screen, so that the right eye receives only the image lines belonging to the original right image recorded by the telecamera, while the left eye receives only the image lines belonging to the original left image recorded by the telecamera.
Thus, it is possible to show the surgeon the desired three-dimensional image of the operative field.
If desired, using a similar procedure, artificial stereoscopic images produced by the computerized system 24 can also be presented in three-dimensional form.
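By way of non-limiting illustration, the line-interleaving performed by the stereo mixer can be sketched as follows (the function name and row representation are assumptions made only for this sketch; the actual mixer 27 is a dedicated electronic device):

```python
def interleave_stereo(left_rows, right_rows):
    """Build a row-interleaved frame for a line-polarized 3D screen.

    Even rows (0, 2, ...) carry lines of the left image and odd rows carry
    lines of the right image, matching the alternating-line scheme that the
    polarized filter on the screen then routes to the corresponding eye.
    """
    assert len(left_rows) == len(right_rows)
    frame = []
    for i in range(len(left_rows)):
        # Each output row keeps the vertical position of its source row.
        frame.append(left_rows[i] if i % 2 == 0 else right_rows[i])
    return frame
```

The polarized glasses then reverse this multiplexing optically, each lens passing only one of the two line sets.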
In any case, using the tracking system 21, it is possible to detect the direction of the surgeon's gaze towards the screen 22 and define which area of the screen he is looking at or not. The use of a three-dimensional visualization system with polarized glasses does not interfere with the tracking system. In addition, glasses with polarized lenses for three-dimensional viewing can be easily designed to be compatible with normal spectacle lenses.
It has been found to be specifically advantageous for the tracking system to send a command that disables the movement of the robotic arms when a gaze direction outside the screen, or at least outside the area of the screen that reproduces the operative field, is detected. In this way, a safety system is provided that prevents movement of the arms without direct supervision by the surgeon.
A so-called "dead man" function is thus obtained, which activates the robot and keeps it activated only while the user is looking at the screen.
Advantageously, for additional security, an additional control device can be provided (for example, a push button 31 on a handle, or the pedal device 20), with double consent being required to enable the movement commands: in order to reactivate the movement, the surgeon must look at the image on the screen and also provide a manual authorization command, while the movement can be interrupted simply by the surgeon looking away from the image.
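The dead-man and double-consent behavior described above can be sketched as a simple decision function (a non-limiting illustration with assumed names; the actual logic resides in the console's control computers described later):

```python
def arm_motion_enabled(gaze_in_field_zone, consent_pressed, was_enabled):
    """Dead-man style enable logic for robotic arm movement.

    - Looking away from the operative-field zone always disables movement.
    - Re-enabling requires double consent: gaze on the zone AND a manual
      authorization command (e.g. push button 31 or pedal device 20).
    """
    if not gaze_in_field_zone:
        return False          # immediate stop when the gaze leaves the zone
    if was_enabled:
        return True           # stays enabled while the gaze remains in the zone
    return consent_pressed    # re-activation needs the manual confirmation
```

Note the asymmetry of the design choice: disabling is automatic and immediate, while enabling always requires a deliberate manual act in addition to the gaze.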
Advantageously, screen 22 also shows, in addition to the endoscope view, at least a part of the human-machine interface. The computerized system 24, which provides the interface, shows on the screen selection areas 29 associated with system commands. Advantageously, the selection areas can be arranged on the same screen 22 that shows the view of the operative field; for example, they can be arranged at the bottom of the screen, below area 23 for viewing the operating field. The tracking system calculates the direction of the surgeon's gaze and performs the selection of the commands associated with a selection area when the gaze direction is detected in that area.
The commands associated with the various selection areas can be of any type considered useful. For example, these commands can be chosen from those that are often used when performing a robotic surgery operation.
It has been found to be specifically advantageous (specifically, when the console comprises two operating manipulators and more than two robotic arms), if the commands associated with the selection areas comprise the commands that assign the manipulators to the robotic arms.
The surgeon can thus switch control of the various robotic arms between the two manipulators without releasing the manipulators, simply by looking at the corresponding selection areas. For example, the surgeon can temporarily take control of the arm with the telecamera to modify the view of the operative field and then quickly return to the control of the robotic arm with which he was operating.
For additional security, the console can advantageously comprise a device for entering a special command confirming the execution of the command associated with the gazed-at selection area. This device can advantageously be a push button 31, which is arranged on one or both manipulators, so as to be pressed, for example, using the thumb of the hand holding the manipulator. Thus, it is possible to easily confirm the actions activated by the eyes through the eye tracking system, for example in order to select a robot to be associated with the manipulator, open/close the surgical instruments and modify the settings of the robot being manipulated.
Another use of the push button can be to control the rotational degree of freedom of the instrument (if available).
It is also possible, advantageously, for the process of assigning a robot to be carried out by visually selecting the image of the new robot to be assigned, confirming the selection using the push button, and then dragging the selected image to the position where the image of the robot currently assigned to the right or left control is shown. Dragging is performed by keeping the button pressed and directing the gaze to the destination position. In order to complete the dragging operation, the push button must be released while the gaze is kept on the indicated area.
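The gaze-and-button drag sequence can be modelled, purely by way of illustration, as a small event handler (the event encoding, identifiers and data structure are assumptions for this sketch; the real HMI is produced by the computerized system 24):

```python
def assign_by_drag(events, assignment, manipulator):
    """Process gaze drag-and-drop events for robot-to-manipulator assignment.

    events: sequence of ("press", target) / ("release", target) tuples, where
    the gaze target at button-press time selects the new robot, and the gaze
    target at release time must be the manipulator's slot for the assignment
    to be confirmed; releasing anywhere else aborts the operation.
    """
    selected = None
    for event, target in events:
        if event == "press":
            selected = target                       # gaze on new robot + button press
        elif event == "release":
            if selected is not None and target == manipulator:
                assignment[manipulator] = selected  # drop on the slot completes it
            selected = None                         # releasing elsewhere aborts
    return assignment
```

This mirrors the requirement in the text that the button be held throughout the drag and released only while the gaze rests on the destination area.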
The eye tracking system can be one of many types known per se. However, an eye tracking system that has been found to be specifically advantageous is one comprising at least one telecamera for recording an image of at least the eyes of the surgeon and devices for calculating the gaze direction depending on the recorded image.
In particular, as shown schematically in figure 2, the tracking system 21 can comprise two telecameras 32, 33 arranged next to each other at an appropriate distance, in order to record two spatially offset images of the surgeon's eyes. The calculation device (for example, comprising a suitably programmed microprocessor) present in the tracking system 21 can thus triangulate the gaze direction by comparing the two recorded images. Advantageously, the tracking system can also comprise an infrared light source 34 for infrared illumination of the eyes, which facilitates their detection in the recorded images.
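The triangulation step can be sketched with the standard stereo depth relation (the function name, parameters and units are illustrative assumptions; the document does not specify the calculation device at this level of detail):

```python
def eye_distance_from_disparity(baseline_cm, focal_px, disparity_px):
    """Estimate the distance of the eyes from two side-by-side telecameras.

    Standard stereo triangulation: depth = (baseline * focal length) / disparity,
    where disparity is the horizontal shift (in pixels) of the detected eye
    position between the two recorded images.
    """
    if disparity_px <= 0:
        raise ValueError("eyes must be detected in both images with positive disparity")
    return baseline_cm * focal_px / disparity_px
```

With, say, a 10 cm camera baseline and a 600-pixel focal length, a 100-pixel disparity corresponds to a 60 cm distance.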
Advantageously, the eye tracking system can be integrated with the monitor, so that if the monitor is moved, the eye tracking continues to operate correctly.
Still referring to figure 2, a block diagram of a possible advantageous embodiment of the console is shown schematically. In this embodiment, the system is divided, for clarity, into three main functional blocks or groups.
The first block, indicated by 40, comprises the components that are directly involved in the movement of the robotic arms. Block 40 contains a first industrial computer 41, known per se, provided with a real-time operating system (for example, RT-Linux) for carrying out the commands associated with the control of the robots within a predetermined guaranteed time. Computer 41 is connected to the robotic control unit (or units) 30 through the communications network 42. Computer 41 receives movement commands from the manipulators 17 and 18, sends them to the robots, and sends signals for the operation of the reactive devices 43 of the manipulators for tactile feedback. Those manual controls that require an immediate response from the system, such as the pedals 20 if they are used to send, among other things, commands to interrupt the robot's movement, are also advantageously connected to computer 41.
The second functional block, indicated by 44, comprises a second industrial computer 45 that produces and controls the human-machine interface (HMI), which does not strictly require real-time operation. The eye tracking system 21, the keyboard 19 (when needed) and the other interface controls are connected to this second computer. The computer 45 also produces the artificial video images to be reproduced on the screen 22 (for example, the visual selection areas 29) and can control any functions necessary to vary the enlargement of the image of the operative field.
Computers 41 and 45 form computerized system 24 for controlling the console.
Computers 41 and 45 and the robotic control unit 30 can communicate with each other via network 42. The HMI application managed by computer 45 thus allows robots to be assigned to manipulators, as well as the visualization of data related to each robot, such as the instruments currently mounted, the state of movement, the feedback state, the position of the rotational supports of the instruments inserted inside the patient's body, the condition of the robot, the state of the robotic connection, any emergency conditions, etc.
The third functional block, indicated by 46, deals with the reproduction of images on the screen, providing, for example, the PiP (Picture-in-Picture) frame-within-frame function, using the signal provided by the telecamera 14 recording the operating field and the image signal 47 produced to display the HMI interface. The third block also includes the stereo mixer 27 for three-dimensional viewing.
Advantageously, a monitor 22 with two separate inputs for the PiP function is used. The main source is displayed in full-screen mode, for example via a DVI connection, while at the same time another video input (for example, a VGA connection) is presented as an inset window. The main source (full screen) consists of a two-dimensional or three-dimensional view of the endoscope, which is received from the endoscopy system. The second source comes from computer 45, which produces the human-machine interface (HMI).
During the calibration of the eye tracking system 21, the full screen view can also be switched dynamically (for example, by means of serial commands sent from the HMI application to the monitor) to the video signal produced by the computer 45.
Advantageously, the console can also comprise a system for detecting the distance between the screen and the surgeon's eyes, in order to vary the magnification of the image of the operative field shown on the screen depending on a variation of the detected distance.
Thus, the surgeon can intuitively enlarge the image simply by moving his face towards the screen and, vice versa, widen the viewing area of the operative field, thus reducing the magnification, by moving his face away from the screen.
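One possible mapping of detected distance to magnification is sketched below (the inverse-proportional law, the reference distance and the zoom limits are all illustrative assumptions, not values given in the text):

```python
def magnification_for_distance(distance_cm, ref_distance_cm=65.0,
                               min_zoom=1.0, max_zoom=4.0):
    """Map the detected face-to-screen distance to an image magnification.

    Moving the face closer than the reference distance zooms in; moving it
    away zooms out, clamped to a sensible range so small posture changes
    do not produce extreme magnifications.
    """
    zoom = ref_distance_cm / max(distance_cm, 1e-6)
    return max(min_zoom, min(max_zoom, zoom))
```

At the reference distance the image is shown at unit magnification; halving the distance doubles the zoom.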
The distance detection system can be implemented in several ways known per se, for example using telemetric ultrasound measuring devices.
Advantageously, however, the eye tracking system 21 can be used for this purpose, since its stereoscopic recording system allows the distance of the surgeon's face to be calculated by means of triangulation. This, together with the eye-detection function of the eye tracking system, allows an accurate measurement of the real distance of the surgeon's point of view from the screen.
In addition to enlargement, the system can also produce an image shift, for example in order to center the image, or to shift it by gaze to the right, to the left, up or down.
When the operative field is recorded by a telecamera, preferably an endoscopic telecamera, mounted on one of the robotic arms, it proves advantageous for the gaze-detection tracking system also to allow control of the movement of this telecamera.
When this function is activated (for example, by entering, through visual selection of a suitable area 29, the associated activation command), the movement of the eyes over the image of the operative field causes the movement of the robotic arm, in order to shift and, advantageously, center the gazed-at area on the screen. The actual movement of the camera can also be performed only after pressing a confirmation pedal or push button, as described previously in this document. In this way, the surgeon is free to move his eyes over the image without moving the viewing frame, unless the confirmation pedal or push button is simultaneously pressed. If the movement and magnification functions described above are combined, the system makes it very easy and intuitive to control the camera: eye movements over the screen shift the viewing frame, while moving the face towards or away from the screen increases or decreases the magnification of the image shown.
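A gaze-driven centering step could be sketched as follows (gain, dead band and coordinate conventions are assumptions introduced only for this sketch; the real control runs on the computers described above):

```python
def camera_centering_step(gaze_px, frame_center_px, confirm_pressed,
                          gain=0.01, deadband_px=20):
    """Compute a camera pan/tilt step that re-centers the gazed-at point.

    Returns a (dx, dy) command proportional to the offset of the gaze point
    from the frame centre, but only while the confirmation pedal/button is
    held; otherwise the viewing frame stays still.
    """
    if not confirm_pressed:
        return (0.0, 0.0)
    ox = gaze_px[0] - frame_center_px[0]
    oy = gaze_px[1] - frame_center_px[1]
    # Ignore small fixation jitter inside a dead band around the centre.
    if abs(ox) < deadband_px and abs(oy) < deadband_px:
        return (0.0, 0.0)
    return (gain * ox, gain * oy)
```

The confirmation gate reproduces the behavior described above: the surgeon may inspect the image freely, and only a held pedal or button turns gaze offsets into camera motion.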
When the three-dimensional visualization of the operating field is used, the distance detection system can also be used to signal to the surgeon when he is within the optimal distance range from the screen.
In fact, in general, three-dimensional systems have an optimal distance range from the screen where the three-dimensional effect is best.
In addition, the combination of the three-dimensional system and the eye tracker imposes certain limitations with regard to the position and distance from the screen, said limitations depending on the position of the surgeon, the tracker and the viewing device.
The console's HMI application can be adapted to indicate to the surgeon, by means of various known acoustic and/or optical systems, when he is located in the optimum position in relation to the screen. In addition, it is also possible to provide a function that indicates whether the distance from the eye tracker is adequate.
When the three-dimensional system and the eye tracker are used together, the appropriate workspace can be the same for both and the same indicator can perform both functions.
The working area of the eye tracker 21 will generally be chosen to be much larger than that for optimal viewing of the three-dimensional image. For example, it has been found that an operating range of the tracker between 40 and 75 cm is advantageous, with the possibility of tracking the eyes within a vertical angle of +30° and -10°. Optimal three-dimensional visualization is obtained at 60-70 cm from the screen (in this interval, the information is perfectly differentiated between the right eye and the left eye) and is therefore well within the operating range of the tracker. However, it will still be possible to have three-dimensional visualization outside this range, as long as the surgeon respects the vertical limit. Beyond the upper and lower ends of the screen, three-dimensionality is lost.
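Using the example figures quoted above, a position indicator could classify the surgeon's pose as follows (the function and return labels are assumptions for the sketch; only the numeric thresholds come from the text):

```python
def workspace_status(distance_cm, vertical_angle_deg):
    """Classify the surgeon's position against the example limits in the text.

    Tracker workspace: 40-75 cm and a vertical angle between -10 and +30
    degrees; optimal 3D viewing: 60-70 cm. Returns "outside",
    "tracking_only" or "optimal_3d".
    """
    in_tracker = 40 <= distance_cm <= 75 and -10 <= vertical_angle_deg <= 30
    if not in_tracker:
        return "outside"
    if 60 <= distance_cm <= 70:
        return "optimal_3d"
    return "tracking_only"
```

As noted in the text, a single indicator of this kind can serve both the tracker-workspace and the optimal-3D functions, since the 3D range lies entirely inside the tracker range.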
At this point, it is clear how the predefined objectives have been achieved. Using the control system and the process described above, it is possible to control the robotic arms with tactile feedback, display the view provided by the endoscope in two or three dimensions together with the HMI application, and activate certain functions using an eye tracker. Due to the use of the eye tracking system, several interesting control possibilities can be obtained. First, among the various functions associated with the eye tracking system, there is that of being able to safely prevent the movement of the robotic arms if the surgeon is not looking at the image of the operative field, with the movement of the robotic arm being automatically prevented or allowed when the detected gaze direction falls outside or within the predetermined zone of the screen.
In addition, the HMI application is intuitive and easy to use, since it can be controlled by the surgeon's eyes (with or without a confirmation activation device). A main advantage is that the surgeon is able to use his eyes to select and assign the robotic arms to the manipulators without removing his hands from the manipulators.
Obviously, the above description of one mode of application of the innovative principles of the present invention is provided by way of example and, therefore, should not be considered as limiting the scope of the rights claimed in this document. For example, the console forms a remote surgical operating station for the robotic system, which can be in the same room or at a certain distance, also employing a connection via geographic networks or the like. The control console is, in fact, a remote control device, which also allows a patient to be operated on outside the operating room and in any location, as long as communication time delays are limited.
The remote surgical system will be suitable for any type of laparoscopic or similar operation. Obviously in this case the term "surgeon" must be understood as meaning anyone who controls the robotic system through the console.
It should be noted that, as can now be easily imagined by one skilled in the art, the system according to the invention is modular and can, for example, be configured to use a larger number of robots (for example, up to five robots) and also one or two control consoles.
权利要求:
Claims (12)
[0001]
1. Robotic surgical system (10) comprising at least one robotic arm (11) that acts under the control of a control console (12) intended for the surgeon, characterized by the fact that: a) the console (12) comprises an eye tracking system (21) for detecting the direction of the surgeon's gaze and for entering commands depending on the direction of the detected gaze; b) the console (12) comprises a screen (22) with at least one viewing area of the operating field and a human-machine interface that displays on the screen selection areas (31) associated with commands; c) the tracking system estimates the direction of the surgeon's gaze and performs the selection of the commands associated with one of the selection areas (31) when it detects a gaze direction that falls within that area; d) the console comprises manipulators (17, 18) for operating the robotic arms; and e) the commands associated with the selection areas (31) comprise commands to assign the manipulators to the robotic arms.
[0002]
2. System, according to claim 1, characterized by the fact that the console (12) comprises a screen (22) with at least one zone (23) for viewing the operating field and that the tracking system (21) generates a command to disable the movement of at least one robotic arm (11) when a gaze direction falling outside the screen, or at least outside said zone (23) of the screen, is detected.
[0003]
3. System, according to claim 1, characterized by the fact that the screen (22) is part of a system for three-dimensional visualization of the operating field.
[0004]
4. System, according to claim 1, characterized by the fact that the eye tracking system comprises at least one telecamera (32, 33) to record an image of at least the eyes of the surgeon and devices (21) to calculate the gaze direction depending on the recorded image.
[0005]
5. System, according to claim 1, characterized by the fact that it comprises a system (21) for detecting the distance between the surgeon's eyes and a screen (22) showing an image of the operative field, this distance detection system performing a variation in the magnification and / or position of the image of the operative field shown on the screen (22), depending on a variation in the detected distance.
[0006]
6. System, according to claim 1, characterized by the fact that the console (12) comprises a screen (22) with at least one zone (23) for viewing the operating field and where the tracking system (21) generates a command that enables the movement of at least one robotic arm (11) when a gaze direction falling on the screen, or at least on said zone (23) of the screen, is detected, this command being combined with a manual command confirming the enabling of the robotic arm movement.
[0007]
7. System, according to claim 1, characterized by the fact that the console (12) comprises a screen (22) with at least one zone (23) for viewing the operating field recorded by a telecamera (14), preferably an endoscopic telecamera, which is moved by a robotic arm (11), the commands depending on the detected gaze direction comprising commands to move the robotic arm in order to vary the viewing frame of the telecamera.
[0008]
8. Method for controlling a robotic surgical system comprising at least one robotic arm (11) that acts under the control of a control console (12) for the surgeon, characterized by the fact that the direction of the surgeon's gaze is detected and functions of the surgical system are controlled depending on the direction of the detected gaze, and in which a new robotic arm chosen from among the various robotic arms is assigned to a control manipulator by the surgeon's gaze selecting an image of the new arm shown on a screen and dragging this image to a position on the screen showing the image of the robot currently assigned to this manipulator.
[0009]
9. Method according to claim 8, characterized in that the images of the operative field are shown in at least one zone of the screen, and the movement of at least one robotic arm is prevented if the detected gaze direction does not fall on the screen, or at least on said zone of the screen, and/or the movement of at least one robotic arm is enabled when the detected gaze direction falls on the screen, or at least on said zone of the screen, and, preferably, an additional manual command confirming the enabling is also sent.
[0010]
10. Method according to claim 8, characterized in that selection areas associated with commands for the surgical system are displayed on a screen and, when a gaze direction falling on one of the selection areas is detected, the command associated with that area is selected and, preferably, the selected command is also confirmed by operating an additional confirmation device.
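The gaze-based selection of claim 10 is commonly realised as dwell selection: a command area is chosen once the gaze rests on it for a short interval, with activation still deferred to the separate confirmation device. The sketch below assumes this dwell mechanism; the class name, dwell time, and `Rect` helper are illustrative, not from the patent.

```python
import time

class Rect:
    """Axis-aligned screen rectangle in pixels (illustrative helper)."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, gx, gy):
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

class DwellSelector:
    """Report a command as selected once the gaze has dwelt on its
    selection area for `dwell_s` seconds; actual execution still awaits
    the additional confirmation device of claim 10."""
    def __init__(self, areas, dwell_s=0.8):
        self.areas = areas          # {command_name: area with .contains(gx, gy)}
        self.dwell_s = dwell_s
        self._candidate = None      # area currently under the gaze
        self._since = 0.0           # when the gaze entered it

    def update(self, gaze_point, now=None):
        """Feed one gaze sample; return the selected command name or None."""
        now = time.monotonic() if now is None else now
        hit = None
        if gaze_point is not None:
            for name, area in self.areas.items():
                if area.contains(*gaze_point):
                    hit = name
                    break
        if hit != self._candidate:  # gaze moved: restart the dwell timer
            self._candidate, self._since = hit, now
            return None
        if hit is not None and now - self._since >= self.dwell_s:
            return hit              # dwell complete; confirm separately
        return None
```

Passing `now` explicitly keeps the logic testable; in live use the monotonic clock is taken automatically on each gaze sample.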
[0011]
11. Method according to claim 8, characterized in that the image of the operative field is shown on a screen, the distance of the surgeon's eyes from the screen is detected, and the magnification and/or position of the image is varied depending on the detected distance, and/or it is signalled whether the detected distance is within a predefined working space of the distance-detection system and/or within a predefined three-dimensional viewing space of the operating-field image.
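A simple way to realise the distance-driven magnification of claims 5 and 11 is to map the detected eye-screen distance to a zoom factor, so that leaning toward the screen magnifies the operative field, and to signal separately whether the distance lies inside the tracker's working space. All numeric limits below are illustrative assumptions, not values from the patent.

```python
def magnification_for_distance(distance_mm, ref_distance_mm=600.0,
                               min_zoom=0.5, max_zoom=3.0):
    """Map the detected eye-screen distance to an image zoom factor:
    smaller distance (leaning in) yields a larger magnification,
    clamped to [min_zoom, max_zoom]."""
    zoom = ref_distance_mm / max(distance_mm, 1.0)   # guard against zero
    return min(max_zoom, max(min_zoom, zoom))

def distance_in_workspace(distance_mm, near_mm=400.0, far_mm=900.0):
    """Signal whether the detected distance lies inside the tracker's
    predefined working space (limits are illustrative)."""
    return near_mm <= distance_mm <= far_mm
```

At the reference distance the zoom is exactly 1.0; halving the distance doubles the magnification until the clamp takes over.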
[0012]
12. Method according to claim 8, characterized in that an image of the operative field recorded by a telecamera, preferably an endoscopic telecamera, moved by a robotic arm (11) is shown on a screen, and the robotic arm varies the framing of the telecamera depending on the detected gaze direction within the screen image.
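The gaze-driven reframing of claims 7 and 12 can be sketched as converting the gaze position within the screen image into a pan/tilt velocity for the camera-holding arm: looking near an edge steers the endoscope that way, while a central dead zone keeps the framing stable. The dead-zone width, gain, and function name are illustrative assumptions.

```python
def camera_pan_command(gaze_point, screen_w, screen_h, dead_zone=0.2, gain=1.0):
    """Return a (pan, tilt) velocity command in [-gain, gain] from the
    gaze position on the screen; gaze inside the central dead zone, or a
    lost gaze signal, commands no motion."""
    if gaze_point is None:
        return (0.0, 0.0)
    gx, gy = gaze_point
    # normalise the gaze to [-1, 1] about the screen centre
    nx = (gx - screen_w / 2) / (screen_w / 2)
    ny = (gy - screen_h / 2) / (screen_h / 2)

    def axis(v):
        if abs(v) <= dead_zone:
            return 0.0
        # rescale so the command ramps from 0 at the dead-zone edge
        # up to +/-gain at the screen border
        sign = 1 if v > 0 else -1
        return gain * (v - dead_zone * sign) / (1 - dead_zone)

    return (axis(nx), axis(ny))
```

A velocity-style command like this lets the arm controller integrate the motion smoothly, rather than jumping the camera to each fixation point.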
Similar technologies:
Publication number | Publication date | Patent title
BR112012025516B1|2020-11-17|robotic surgical system with improved control
US20190047154A1|2019-02-14|Synthetic representation of a surgical robot
KR101772958B1|2017-08-31|Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
KR20160008172A|2016-01-21|Surgical equipment control input visualization field
US20160180046A1|2016-06-23|Device for intermediate-free centralised control of remote medical apparatuses, with or without contact
US20210088807A1|2021-03-25|Systems and methods for three-dimensional visualization during robotic surgery
WO2015023513A1|2015-02-19|Endoscope control system
CN110226967A|2019-09-13|For remotely operating the structural adjustment system and method for medical system
JP2001275931A|2001-10-09|Medical treatment system
US20200015655A1|2020-01-16|Medical observation apparatus and observation visual field correction method
JP5800609B2|2015-10-28|Medical master-slave manipulator
JP2020048706A|2020-04-02|Surgery system and display method
JP2020048708A|2020-04-02|Surgery system and display method
US20210121245A1|2021-04-29|Surgeon interfaces using augmented reality
US10921897B2|2021-02-16|System and method for assisting operator engagement with input devices
JP2021062216A|2021-04-22|Surgical system and display method
JP2020096923A|2020-06-25|Surgery system
WO2021191598A1|2021-09-30|Virtual console for controlling a surgical robot
CN112107276A|2020-12-22|Medical operating device and method for controlling an operating device
Patent family:
Publication number | Publication date
EP3395251A1|2018-10-31|
US20190231456A1|2019-08-01|
EP2555703A1|2013-02-13|
US11224489B2|2022-01-18|
RU2727304C2|2020-07-21|
CA2795216C|2019-07-30|
EP2555703B1|2018-07-25|
ITMI20100579A1|2011-10-08|
CA2795216A1|2011-10-13|
US9360934B2|2016-06-07|
CN105852972A|2016-08-17|
BR112012025516A2|2019-10-08|
RU2015145035A3|2019-04-26|
US20130030571A1|2013-01-31|
CN102958464B|2016-03-02|
IT1401669B1|2013-08-02|
US20160249992A1|2016-09-01|
BR122014005569B1|2020-10-20|
BR122014005569A2|2019-10-29|
US10251713B2|2019-04-09|
RU2569699C2|2015-11-27|
RU2012142510A|2014-05-20|
ES2692695T3|2018-12-04|
RU2015145035A|2019-01-10|
WO2011125007A1|2011-10-13|
CN102958464A|2013-03-06|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US6405072B1|1991-01-28|2002-06-11|Sherwood Services Ag|Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus|
JP3492697B2|1994-10-07|2004-02-03|セントルイスユニバーシティー|Surgical guidance device with reference and localization frame|
US6847336B1|1996-10-02|2005-01-25|Jerome H. Lemelson|Selectively controllable heads-up display system|
US7228165B1|2000-06-26|2007-06-05|Boston Scientific Scimed, Inc.|Apparatus and method for performing a tissue resection procedure|
DE10226539A1|2002-06-14|2004-01-08|Leica Microsystems Ag|Voice control for surgical microscopes|
GB0222265D0|2002-09-25|2002-10-30|Imp College Innovations Ltd|Control of robotic manipulation|
US7899512B2|2004-03-22|2011-03-01|Vanderbilt University|System and method for surgical instrument disablement via image-guided position feedback|
US20060074307A1|2004-05-10|2006-04-06|Tatsuo Igarashi|Body cavity diagnostic system|
US7501995B2|2004-11-24|2009-03-10|General Electric Company|System and method for presentation of enterprise, clinical, and decision support information utilizing eye tracking navigation|
JP2008529707A|2005-02-18|2008-08-07|コーニンクレッカフィリップスエレクトロニクスエヌヴィ|Automatic control of medical equipment|
RU2295927C1|2005-07-11|2007-03-27|Государственное Образовательное Учреждение Высшего Профессионального Образования "Дальневосточный Государственный Медицинский Университет" Министерства Здравоохранения Российской Федерации ГОУ ВПО ДВГМУ МЗ РФ|Robotic system for fulfilling reposition of osseous fragments at fractures of osseous skeleton|
US8079950B2|2005-09-29|2011-12-20|Intuitive Surgical Operations, Inc.|Autofocus and/or autoscaling in telesurgery|
US8577538B2|2006-07-14|2013-11-05|Irobot Corporation|Method and system for controlling a remote vehicle|
EP1815950A1|2006-02-03|2007-08-08|The European Atomic Energy Community , represented by the European Commission|Robotic surgical system for performing minimally invasive medical procedures|
EP1815949A1|2006-02-03|2007-08-08|The European Atomic Energy Community , represented by the European Commission|Medical robotic system with manipulator arm of the cylindrical coordinate type|
DE102006011233B4|2006-03-10|2011-04-28|Siemens Ag|Method and device for optimizing the image representation on an imaging device|
US7591558B2|2006-05-31|2009-09-22|Sony Ericsson Mobile Communications Ab|Display based on eye information|
EP1915963A1|2006-10-25|2008-04-30|The European Atomic Energy Community , represented by the European Commission|Force estimation for a minimally invasive robotic surgery system|
MY142560A|2007-01-18|2010-12-15|Cimb Group Sdn Bhd|A bond analysis system|
US8808164B2|2008-03-28|2014-08-19|Intuitive Surgical Operations, Inc.|Controlling a robotic surgical tool with a display monitor|
US8155479B2|2008-03-28|2012-04-10|Intuitive Surgical Operations Inc.|Automated panning and digital zooming for robotic surgical systems|
US20090307601A1|2008-06-05|2009-12-10|Ibm Corporation|Method for reflexive securing of computer displays|
KR100998182B1|2008-08-21|2010-12-03|미래컴퍼니|3D display system of surgical robot and control method thereof|
US9439736B2|2009-07-22|2016-09-13|St. Jude Medical, Atrial Fibrillation Division, Inc.|System and method for controlling a remote medical device guidance system in three-dimensions using gestures|
AU2010221722A1|2009-02-06|2011-08-18|Oculis Labs, Inc.|Video-based privacy supporting system|
DE102009010263B4|2009-02-24|2011-01-20|Reiner Kunz|Method for navigating an endoscopic instrument during technical endoscopy and associated device|
US8521331B2|2009-11-13|2013-08-27|Intuitive Surgical Operations, Inc.|Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument|
IT1401669B1|2010-04-07|2013-08-02|Sofar Spa|ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL.|
US9557812B2|2010-07-23|2017-01-31|Gregory A. Maltz|Eye gaze user interface and calibration method|
WO2012153152A1|2011-05-12|2012-11-15|Imperial Innovations Limited|Medical master/slave type device for minimally invasive surgery|
US20140024889A1|2012-07-17|2014-01-23|Wilkes University|Gaze Contingent Control System for a Robotic Laparoscope Holder|
CN102842301B|2012-08-21|2015-05-20|京东方科技集团股份有限公司|Display frame adjusting device, display device and display method|
KR20140112207A|2013-03-13|2014-09-23|삼성전자주식회사|Augmented reality imaging display system and surgical robot system comprising the same|
US11087271B1|2017-03-27|2021-08-10|Amazon Technologies, Inc.|Identifying user-item interactions in an automated facility|
US11083601B1|2017-07-10|2021-08-10|Braden William Thomas Leonard|Prosthetic hand system|
US10152141B1|2017-08-18|2018-12-11|Osterhout Group, Inc.|Controller movement tracking with light emitters|
US8219178B2|2007-02-16|2012-07-10|Catholic Healthcare West|Method and system for performing invasive medical procedures using a surgical robot|
US10231791B2|2012-06-21|2019-03-19|Globus Medical, Inc.|Infrared signal based position recognition system for use with a robot-assisted surgery|
US11253327B2|2012-06-21|2022-02-22|Globus Medical, Inc.|Systems and methods for automatically changing an end-effector on a surgical robot|
US11045267B2|2012-06-21|2021-06-29|Globus Medical, Inc.|Surgical robotic automation with tracking markers|
US10893912B2|2006-02-16|2021-01-19|Globus Medical Inc.|Surgical tool systems and methods|
US10136954B2|2012-06-21|2018-11-27|Globus Medical, Inc.|Surgical tool systems and method|
EP1915963A1|2006-10-25|2008-04-30|The European Atomic Energy Community , represented by the European Commission|Force estimation for a minimally invasive robotic surgery system|
US10357184B2|2012-06-21|2019-07-23|Globus Medical, Inc.|Surgical tool systems and method|
US11116576B2|2012-06-21|2021-09-14|Globus Medical Inc.|Dynamic reference arrays and methods of use|
US10350013B2|2012-06-21|2019-07-16|Globus Medical, Inc.|Surgical tool systems and methods|
US8423182B2|2009-03-09|2013-04-16|Intuitive Surgical Operations, Inc.|Adaptable integrated energy control system for electrosurgical tools in robotic surgical systems|
IT1401669B1|2010-04-07|2013-08-02|Sofar Spa|ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL.|
US8672837B2|2010-06-24|2014-03-18|Hansen Medical, Inc.|Methods and devices for controlling a shapeable medical device|
EP2627278B1|2010-10-11|2015-03-25|Ecole Polytechnique Fédérale de Lausanne |Mechanical manipulator for surgical instruments|
WO2012131660A1|2011-04-01|2012-10-04|Ecole Polytechnique Federale De Lausanne |Robotic system for spinal and other surgeries|
JP6309447B2|2011-05-31|2018-04-11|インテュイティブ サージカル オペレーションズ, インコーポレイテッド|Active control of end effectors of surgical instruments by robots|
JP5715304B2|2011-07-27|2015-05-07|エコール ポリテクニーク フェデラル デ ローザンヌ (イーピーエフエル)|Mechanical remote control device for remote control|
JP6141289B2|2011-10-21|2017-06-07|インテュイティブ サージカル オペレーションズ, インコーポレイテッド|Gripping force control for robotic surgical instrument end effector|
US9503713B2|2011-11-02|2016-11-22|Intuitive Surgical Operations, Inc.|Method and system for stereo gaze tracking|
EP2863827A4|2012-06-21|2016-04-20|Globus Medical Inc|Surgical robot platform|
US20140024889A1|2012-07-17|2014-01-23|Wilkes University|Gaze Contingent Control System for a Robotic Laparoscope Holder|
DE102012212780A1|2012-07-20|2014-01-23|Siemens Aktiengesellschaft|Operating device for inputting operating commands for controlling a technical system|
EP2895098A4|2012-09-17|2016-05-25|Intuitive Surgical Operations|Methods and systems for assigning input devices to teleoperated surgical instrument functions|
US10631939B2|2012-11-02|2020-04-28|Intuitive Surgical Operations, Inc.|Systems and methods for mapping flux supply paths|
US10864048B2|2012-11-02|2020-12-15|Intuitive Surgical Operations, Inc.|Flux disambiguation for teleoperated surgical systems|
WO2014093367A1|2012-12-10|2014-06-19|Intuitive Surgical Operations, Inc.|Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms|
US20140160004A1|2012-12-11|2014-06-12|Biosense Webster , Ltd.|Use of physician eye tracking during a procedure|
US9167147B2|2013-02-15|2015-10-20|International Business Machines Corporation|Mobile device field of view region determination|
US9041645B2|2013-02-15|2015-05-26|International Business Machines Corporation|Transparent display field of view region determination|
US9057600B2|2013-03-13|2015-06-16|Hansen Medical, Inc.|Reducing incremental measurement sensor error|
WO2014151621A1|2013-03-15|2014-09-25|Sri International|Hyperdexterous surgical system|
US9014851B2|2013-03-15|2015-04-21|Hansen Medical, Inc.|Systems and methods for tracking robotically controlled medical instruments|
US11020016B2|2013-05-30|2021-06-01|Auris Health, Inc.|System and method for displaying anatomy and devices on a movable display|
US9283048B2|2013-10-04|2016-03-15|KB Medical SA|Apparatus and systems for precise guidance of surgical tools|
US10314463B2|2014-10-24|2019-06-11|Auris Health, Inc.|Automated endoscope calibration|
FR3015222B1|2013-12-24|2019-11-22|General Electric Company|METHOD OF PROCESSING MEDICAL IMAGES BY FOLLOWING REGARD|
US10039605B2|2014-02-11|2018-08-07|Globus Medical, Inc.|Sterile handle for controlling a robotic surgical system from a sterile field|
US10166061B2|2014-03-17|2019-01-01|Intuitive Surgical Operations, Inc.|Teleoperated surgical system equipment with user interface|
CN106456148B|2014-03-19|2020-06-12|直观外科手术操作公司|Medical devices, systems, and methods using eye gaze tracking|
CN106659541B|2014-03-19|2019-08-16|直观外科手术操作公司|Integrated eyeball stares medical device, the system and method that tracking is used for stereoscopic viewer|
EP3243476B1|2014-03-24|2019-11-06|Auris Health, Inc.|Systems and devices for catheter driving instinctiveness|
CN106659537B|2014-04-24|2019-06-11|Kb医疗公司|The surgical instrument holder used in conjunction with robotic surgical system|
US10613627B2|2014-05-12|2020-04-07|Immersion Corporation|Systems and methods for providing haptic feedback for remote interactions|
CN104090659B|2014-07-08|2017-04-05|重庆金瓯科技发展有限责任公司|Operating pointer based on eye image and Eye-controlling focus indicates control device|
CN104055478B|2014-07-08|2016-02-03|金纯�|Based on the medical endoscope control system that Eye-controlling focus controls|
WO2016008880A1|2014-07-14|2016-01-21|KB Medical SA|Anti-skid surgical instrument for use in preparing holes in bone tissue|
US10179407B2|2014-11-16|2019-01-15|Robologics Ltd.|Dynamic multi-sensor and multi-robot interface system|
US10864049B2|2014-12-19|2020-12-15|Distalmotion Sa|Docking system for mechanical telemanipulator|
WO2016097868A1|2014-12-19|2016-06-23|Distalmotion Sa|Reusable surgical instrument for minimally invasive procedures|
US11039820B2|2014-12-19|2021-06-22|Distalmotion Sa|Sterile interface for articulated surgical instruments|
EP3232974B1|2014-12-19|2018-10-24|DistalMotion SA|Articulated handle for mechanical telemanipulator|
US10864052B2|2014-12-19|2020-12-15|Distalmotion Sa|Surgical instrument with articulated end-effector|
US10013808B2|2015-02-03|2018-07-03|Globus Medical, Inc.|Surgeon head-mounted display apparatuses|
US10555782B2|2015-02-18|2020-02-11|Globus Medical, Inc.|Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique|
EP3280343A1|2015-04-09|2018-02-14|DistalMotion SA|Mechanical teleoperated device for remote manipulation|
US20160306420A1|2015-04-17|2016-10-20|Charles Arthur Hill, III|Method for Controlling a Surgical Camera through Natural Head Movements|
WO2017015167A1|2015-07-17|2017-01-26|Deka Products Limited Partnership|Robotic surgery system, mithod, and appratus|
US10058394B2|2015-07-31|2018-08-28|Globus Medical, Inc.|Robot arm and methods of use|
WO2017025486A1|2015-08-07|2017-02-16|SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH|Method and system to control a workflow and method and system for providing a set of task-specific control parameters|
US10080615B2|2015-08-12|2018-09-25|Globus Medical, Inc.|Devices and methods for temporary mounting of parts to bone|
EP3310286A1|2015-08-13|2018-04-25|Siemens Healthcare GmbH|Device and method for controlling a system comprising an imaging modality|
US10786272B2|2015-08-28|2020-09-29|Distalmotion Sa|Surgical instrument with increased actuation force|
JP6525150B2|2015-08-31|2019-06-05|インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation|Method for generating control signals for use with a telepresence robot, telepresence system and computer program|
US10034716B2|2015-09-14|2018-07-31|Globus Medical, Inc.|Surgical robotic systems and methods thereof|
US10045825B2|2015-09-25|2018-08-14|Karl Storz Imaging, Inc.|Partial facial recognition and gaze detection for a medical system|
US9771092B2|2015-10-13|2017-09-26|Globus Medical, Inc.|Stabilizer wheel assembly and methods of use|
ITUB20155830A1|2015-11-23|2017-05-23|R A W Srl|"NAVIGATION, TRACKING AND GUIDE SYSTEM FOR THE POSITIONING OF OPERATOR INSTRUMENTS"|
US10143526B2|2015-11-30|2018-12-04|Auris Health, Inc.|Robot-assisted driving systems and methods|
CN108472090B|2015-12-29|2021-06-18|皇家飞利浦有限公司|System, control unit and method for controlling a surgical robot|
US10646289B2|2015-12-29|2020-05-12|Koninklijke Philips N.V.|System, controller and method using virtual reality device for robotic surgery|
US10117632B2|2016-02-03|2018-11-06|Globus Medical, Inc.|Portable medical imaging system with beam scanning collimator|
US10448910B2|2016-02-03|2019-10-22|Globus Medical, Inc.|Portable medical imaging system|
US11058378B2|2016-02-03|2021-07-13|Globus Medical, Inc.|Portable medical imaging system|
US10842453B2|2016-02-03|2020-11-24|Globus Medical, Inc.|Portable medical imaging system|
US10866119B2|2016-03-14|2020-12-15|Globus Medical, Inc.|Metal detector for detecting insertion of a surgical device into a hollow tube|
AU2017236893A1|2016-03-21|2018-09-06|Washington University|Virtual reality or augmented reality visualization of 3D medical images|
US10733275B1|2016-04-01|2020-08-04|Massachusetts Mutual Life Insurance Company|Access control through head imaging and biometric authentication|
US10956544B1|2016-04-01|2021-03-23|Massachusetts Mutual Life Insurance Company|Access control through head imaging and biometric authentication|
US11267125B2|2016-04-08|2022-03-08|Delta Electronics, Inc.|Mechanism-parameter-calibration method for robotic arm system|
US10551923B2|2016-05-04|2020-02-04|Worcester Polytechnic Institute|Haptic glove as a wearable force feedback user interface|
AU2017275482A1|2016-06-03|2018-11-15|Covidien Lp|Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display|
US10568703B2|2016-09-21|2020-02-25|Verb Surgical Inc.|User arm support for use in a robotic surgical system|
EP3515324A4|2016-09-23|2020-05-20|U.S. Patent Innovations LLC|Robotic surgical system|
US9931025B1|2016-09-30|2018-04-03|Auris Surgical Robotics, Inc.|Automated calibration of endoscopes with pull wires|
US10813710B2|2017-03-02|2020-10-27|KindHeart, Inc.|Telerobotic surgery system using minimally invasive surgical tool with variable force scaling and feedback and relayed communications between remote surgeon and surgery station|
WO2018164909A1|2017-03-10|2018-09-13|Biomet Manufacturing, Llc|Augmented reality supported knee surgery|
TW201902431A|2017-04-28|2019-01-16|美商尼奧西斯股份有限公司|Method for guiding oral cavity surface program and related system|
US11058503B2|2017-05-11|2021-07-13|Distalmotion Sa|Translational instrument interface for surgical robot and surgical robot systems comprising the same|
EP3629981A4|2017-05-25|2021-04-07|Covidien LP|Systems and methods for detection of objects within a field of view of an image capture device|
US10675094B2|2017-07-21|2020-06-09|Globus Medical Inc.|Robot surgical platform|
EP3658059A4|2017-07-27|2020-07-15|Intuitive Surgical Operations Inc.|Association processes and related systems for manipulators|
WO2019023020A1|2017-07-27|2019-01-31|Intuitive Surgical Operations, Inc.|Association processes and related systems for manipulators|
EP3336848A1|2017-08-15|2018-06-20|Siemens Healthcare GmbH|Method for operating a medical imaging device and medical imaging device|
US20200261160A1|2017-09-05|2020-08-20|Covidien Lp|Robotic surgical systems and methods and computer-readable media for controlling them|
US11058493B2|2017-10-13|2021-07-13|Auris Health, Inc.|Robotic system configured for navigation path tracing|
JP6778242B2|2017-11-09|2020-10-28|グローバス メディカル インコーポレイティッド|Surgical robot systems for bending surgical rods, and related methods and equipment|
US11134862B2|2017-11-10|2021-10-05|Globus Medical, Inc.|Methods of selecting surgical implants and related devices|
AU2018380139A1|2017-12-06|2020-05-21|Auris Health, Inc.|Systems and methods to correct for uncommanded instrument roll|
US11071595B2|2017-12-14|2021-07-27|Verb Surgical Inc.|Multi-panel graphical user interface for a robotic surgical system|
CA3079816A1|2017-12-14|2019-06-20|Verb Surgical Inc.|Multi-panel graphical user interface for a robotic surgical system|
EP3684283A4|2017-12-18|2021-07-14|Auris Health, Inc.|Methods and systems for instrument tracking and navigation within luminal networks|
CN108065904A|2018-01-02|2018-05-25|京东方科技集团股份有限公司|Endoscopic system and its control method|
EP3737326A4|2018-01-10|2021-12-29|Covidien LP|Determining positions and conditions of tools of a robotic surgical system utilizing computer vision|
US20210030498A1|2018-02-02|2021-02-04|Covidien Lp|Robotic surgical systems with user engagement monitoring|
WO2019155383A1|2018-02-07|2019-08-15|Distalmotion Sa|Surgical robot systems comprising robotic telemanipulators and integrated laparoscopy|
KR20200122337A|2018-02-13|2020-10-27|아우리스 헬스, 인코포레이티드|Systems and methods for driving medical devices|
US20190254753A1|2018-02-19|2019-08-22|Globus Medical, Inc.|Augmented reality navigation systems for use with robotic surgical systems and methods of their use|
CN110891469A|2018-03-28|2020-03-17|奥瑞斯健康公司|System and method for registration of positioning sensors|
JP2021519154A|2018-03-28|2021-08-10|オーリス ヘルス インコーポレイテッド|Systems and methods for indicating the estimated position of the instrument|
US10573023B2|2018-04-09|2020-02-25|Globus Medical, Inc.|Predictive visualization of medical imaging scanner component movement|
DE102018110644A1|2018-05-03|2019-11-07|Carl Zeiss Meditec Ag|Digital microscope and digital microscopy method|
WO2019222395A1|2018-05-16|2019-11-21|Intuitive Surgical Operations, Inc.|System and method for hybrid control using eye tracking|
CN110831486A|2018-05-30|2020-02-21|奥瑞斯健康公司|System and method for location sensor based branch prediction|
EP3801348A1|2018-05-31|2021-04-14|Auris Health, Inc.|Image-based airway analysis and mapping|
KR20210018858A|2018-05-31|2021-02-18|아우리스 헬스, 인코포레이티드|Route-based navigation of coronary networks|
US10895757B2|2018-07-03|2021-01-19|Verb Surgical Inc.|Systems and methods for three-dimensional visualization during robotic surgery|
JP6770025B2|2018-07-12|2020-10-14|ファナック株式会社|robot|
JP6856594B2|2018-09-25|2021-04-07|株式会社メディカロイド|Surgical system and display method|
JP6898285B2|2018-09-25|2021-07-07|株式会社メディカロイド|Surgical system and display method|
US10765487B2|2018-09-28|2020-09-08|Auris Health, Inc.|Systems and methods for docking medical instruments|
EP3671305A1|2018-12-18|2020-06-24|Eberhard Karls Universität Tübingen|Exoscope system and use of such an exoscope system|
US11204640B2|2019-05-17|2021-12-21|Verb Surgical Inc.|Methods for determining if teleoperation should be disengaged based on the user's gaze|
US20200360096A1|2019-05-17|2020-11-19|Verb Surgical Inc.|Interlock mechanisms to disengage and engage a teleoperation mode|
US11045179B2|2019-05-20|2021-06-29|Global Medical Inc|Robot-mounted retractor system|
WO2020243192A1|2019-05-29|2020-12-03|Intuitive Surgical Operations, Inc.|Operating mode control systems and methods for a computer-assisted surgical system|
WO2021038469A1|2019-08-30|2021-03-04|Auris Health, Inc.|Systems and methods for weight-based registration of location sensors|
US11147633B2|2019-08-30|2021-10-19|Auris Health, Inc.|Instrument image reliability systems and methods|
US11207150B2|2020-02-19|2021-12-28|Globus Medical, Inc.|Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment|
CN111281649B|2020-03-03|2021-08-13|西安交通大学|Ophthalmic surgery robot system and control method thereof|
US20210282878A1|2020-03-11|2021-09-16|Verb Surgical Inc.|Surgeon disengagement detection during termination of teleoperation|
US11253216B2|2020-04-28|2022-02-22|Globus Medical Inc.|Fixtures for fluoroscopic imaging systems and related navigation systems and methods|
US11153555B1|2020-05-08|2021-10-19|Globus Medical Inc.|Extended reality headset camera system for computer assisted navigation in surgery|
CN111657842B|2020-06-02|2021-08-31|首都医科大学附属北京地坛医院|Probe control method and probe control device|
Legal status:
2019-12-10| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-12-10| B25A| Requested transfer of rights approved|Owner name: TRANSENTERIX ITALIA, S.R.L. (IT) |
2019-12-24| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-07-28| B15K| Others concerning applications: alteration of classification|Free format text: THE PREVIOUS CLASSIFICATIONS WERE: A61B 19/00, A61B 5/11, A61B 3/113, G06F 3/01 IPC: A61B 17/00 (2006.01), A61B 34/30 (2016.01), G06F 3 |
2020-08-04| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-11-17| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 01/04/2011, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
ITMI2010A000579A|IT1401669B1|2010-04-07|2010-04-07|ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL.|
ITMI2010A000579|2010-04-07|
PCT/IB2011/051397|WO2011125007A1|2010-04-07|2011-04-01|Robotized surgery system with improved control|BR122014005569-2A| BR122014005569B1|2010-04-07|2011-04-01|robotic surgical system with improved control|