METHOD, SYSTEM AND COMPUTER PROGRAM OF PROXIMITY DETECTION
Patent abstract:
Method, system and computer program for proximity detection. A method of proximity detection between a device and an obstacle, comprising: establishing an initial geometric description (A(t=0)) of a device under control (12, 22) as a set of primitive shapes; defining (32) a volume to be monitored (V1, V2) around said device under control (12, 22); obtaining (31) a set of points in space (N1) by means of a 3D sensor (13, 23); performing a first filtering (35) of said set of points in space (N1) to eliminate from said set of points (N1) the points that lie outside said volume to be monitored (V1, V2), obtaining a subset of points (N2); monitoring (33, 34) the geometry of the device under control (12, 22), updating the geometric description of the device under control (A(t), As(t)) represented as a set of primitive shapes; performing a second filtering (36) of said subset of points (N2) to eliminate the points remaining within said updated geometric description (A(t)) of the device under control (12, 22), obtaining a set of points in space B(t) representing an obstacle (14, 24) within said volume to be monitored (V1, V2); and calculating a distance (D1, D2) between said updated geometric description (A(t), As(t)) of the device under control (12, 22) and said set of points in space B(t) representing said obstacle (14, 24). A corresponding system and computer program are also provided. (Machine translation by Google Translate; not legally binding.)

Publication number: ES2681123A1
Application number: ES201730325
Filing date: 2017-03-10
Publication date: 2018-09-11
Inventors: Carlos TUBIO OTERO; Loreto SUSPERREGI ZABALO; Ane FERNANDEZ GONZALEZ; Ander ANSUATEGI COBO; Iñaki MAURTUA ORMAECHEA
Applicant: Fundacion Tekniker
IPC main class:
Patent description:
METHOD, SYSTEM AND COMPUTER PROGRAM OF PROXIMITY DETECTION

FIELD OF THE INVENTION

The present invention belongs to the field of safety methods and systems. More specifically, the invention belongs to the field of proximity detection methods and systems for avoiding collisions with people or objects.

BACKGROUND OF THE INVENTION

Avoiding collisions between a moving object and any element (including people or animals) that may cross its path is a need that is currently addressed in different ways. Land vehicles, which normally circulate at relatively high speed, often incorporate on-board collision prevention systems based on GPS and sensors, such as radars or cameras, which allow locating both the vehicle itself and potential obstacles. For example, US patent 8924139 B2 describes one such collision prevention system.

For other types of moving objects it is not feasible or, due for example to the relatively low speed that these objects can reach, not worthwhile to incorporate detection systems such as that described in US 8924139 B2. Such is the case, for example, of small vehicles for the transport of people or goods in specific areas, such as warehouses, airports, train stations, tourist routes, etc.; of cargo handling equipment, such as manipulators, mechatronic elements or cranes, normally remotely operated or programmed to follow a trajectory, such as manipulator robots; or of other vehicles dedicated to specific sectors such as agriculture (tractors, etc.) or construction (excavators, etc.).

US patent application 2006/0287829 A1 describes a method for detecting potential collisions between a cargo handling vehicle and surrounding objects, based on a sensor system on the vehicle for detecting objects in its surroundings and an alert system for emitting alarm signals if an object is detected within a certain distance.
However, this method requires installing distance sensors (ultrasound, radar, laser and/or electromagnetic detectors) on the periphery of the vehicle to be monitored, which implies selecting a priori the risk points of the vehicle (for example, an airplane), when in reality the risk points depend on the type of situation that may occur.

In turn, US patent application 2015/0254985 A1 describes a system for avoiding collisions in underground mines. The system is based on a thermal vision system that captures images in real time for the identification and positioning of targets, from which their direction and speed are also estimated. Collision avoidance software is also included, configured to detect the proximity of targets, determine their threat level and avoid collisions based on that determination. However, this system needs an infrastructure of devices external to the mobile element. In addition, the thermal vision system requires training for the identification of predefined objects based on their thermal footprint.

Methods and devices are also known that inform their bearer about the presence of nearby objects and the distance at which they are found. For example, patent application US 2011/0025492 A1 describes an apparatus that informs its wearer of the presence of a nearby object by means of tactile vibrations after the object has been detected by a proximity detector.

DESCRIPTION OF THE INVENTION

The present invention provides a proximity detection method, system and computer program that solve the drawbacks of the previous proposals. The method, system and computer program are based on monitoring a volume around a device under control, by calculating a distance between the device under control and an obstacle. The minimum distance between the two elements is preferably calculated. From this distance, the risk of collision between the device under control and an obstacle located within the monitored volume can be determined.
The proximity detection method is implemented by a software module (computer program) that can be configured and integrated into any application in which there is an interest in monitoring a volume around a device under control, whether a mobile device or a static one.

In the context of the present disclosure, "device under control" means any device or equipment, whether static or moving, including devices that can move either by themselves or transported on other equipment or pushed or driven by means external to the device. The invention is particularly applicable to devices in motion or with the ability to move, either by themselves or transported or pushed by other devices or equipment. The device (or the equipment on which the device is located or transported) can be driven by an operator who handles it locally, or operated remotely, or it can move autonomously, for example, but not limited to, because it is programmed to follow a trajectory or to autonomously search for a trajectory towards a certain target. The operator is a person who controls the movement of the device (or of the equipment that transports it, as the case may be), either by controlling the device (or the equipment that transports it) as such or by controlling external means that drive or move the device (or the equipment that transports it).

Equipment that can transport a device under control includes, but is not limited to: equipment that can carry a load, whether locally or remotely operated, or programmed to follow or search for a path, such as a manipulator, a mechatronic element, a crane, a robot, etc.; vehicles for the transport of people or goods, for example in specific areas such as warehouses, airports, train stations, tourist routes, etc.; or vehicles dedicated to specific sectors such as agriculture (tractors, etc.) or construction (excavators, etc.). Such equipment can itself be the device under control.
In the event that the method of the invention is used to monitor the risk of collision between a person or animal and a potential obstacle, the term "device under control" also includes the person or animal whose collision risk is being monitored (for example, blind people, young children or people with a disorder that prevents them from assessing collision risks).

In the context of this disclosure, "obstacle" means any object, person or animal that is within a volume surrounding the device under control. That is to say, "obstacle" includes not only those objects, people or animals that, being within said volume, lie in the trajectory of the device under control, but any object, person or animal that, being within said volume, is considered a potential collision threat. The obstacle may be static or moving.

In the context of the present disclosure, "3D sensor" means any sensor capable of offering three-dimensional information of the environment that can be represented in the form of a point cloud. 3D sensors are based on different technologies, such as vision, laser or radar, among others. The point cloud represents the volumetric information obtained by the sensor, associating a value with each point X, Y, Z, which allows representing the obstacles detected by the sensor. As a non-limiting example, Velodyne's PUCK™ (VLP-16) LiDAR can be used, which offers distance information to objects in real time, with a range of 100 m, a 360° horizontal field of view and a ±15° vertical field of view.

The 3D sensor (hereinafter, the sensor) can be placed either in the environment or surroundings of the device under control, or on or in the device under control itself. When the sensor is located on the device under control and the device under control is moving, the sensor moves together with the device under control, so that a previously defined volume is always monitored around the device under control.
That is, the monitored volume moves with the device under control. Alternatively, the sensor can be placed outside the device under control. In this case, the volume to be monitored can be associated with the sensor, in which case a fixed volume is monitored, or it can be associated with, and move with, the device under control.

The method and computer program compute the collision risk between the device under control and an obstacle from several sources of information. These sources of information are, at least: the geometry of the device under control at a given time, A(t); and the geometry of the obstacle(s) detected by the sensor within a volume at a given time, B(t). Note that both the geometry of the device under control and that of the obstacle(s) detected within the monitored volume vary, or may vary, over time, since both the device under control and the obstacle(s) may change position over time (i.e., they may be moving). The geometry of the device under control can be fixed or variable. If, for example, it is a robotic manipulator with 7 degrees of freedom, the geometry changes depending on the state of each of the degrees of freedom, the geometric configuration changing as it moves from one point to another. The same is true, for example, of a vehicle carrying a manipulator that in turn carries a load: in this case, the assembly formed by the manipulator and the load can be considered the "device under control" (manipulator and load together determine the geometric configuration of the device). In another example, a crane can carry different loads: the load determines the geometric configuration of the device. To assess the risk of collision, the method, system and computer program compute a distance, preferably the minimum distance, between said geometries A(t) and B(t).
The geometry of the device under control A(t) is described as a set of primitive shapes, such as, but not limited to, boxes, cylinders, etc. This simplified representation of the geometry of the device under control significantly reduces the computation time of the minimum distance. Because of this, the method, system and computer program are capable of reacting to highly dynamic environments.

As for the geometry B(t) of the obstacle(s) detected by the sensor within a volume V, it is represented as the set of voxels of volume V occupied by the elements (obstacles) within this volume. When an element is within the volume and is therefore considered an obstacle, the voxels corresponding to the obstacle are marked as occupied. That is, each voxel is associated with a bit that indicates whether the voxel is marked or unmarked. As the skilled person knows, a voxel (volumetric pixel) is the cubic unit that makes up a three-dimensional object. The voxel is the minimum processable unit of a three-dimensional matrix, and is therefore the equivalent of the pixel in a 2D object.
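As a minimal, non-binding sketch of this occupied-voxel representation (assuming, purely for illustration, an axis-aligned monitored volume and a uniform voxel size, neither of which the patent prescribes), B(t) could be built as follows:

```python
import numpy as np

def voxelize(points, volume_min, volume_max, voxel_size):
    """Mark which voxels of a grid spanning [volume_min, volume_max]
    are occupied by at least one point of the cloud."""
    volume_min = np.asarray(volume_min, dtype=float)
    volume_max = np.asarray(volume_max, dtype=float)
    dims = np.ceil((volume_max - volume_min) / voxel_size).astype(int)
    occupied = np.zeros(dims, dtype=bool)               # one bit per voxel
    idx = np.floor((points - volume_min) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < dims), axis=1)  # ignore points outside V
    occupied[tuple(idx[inside].T)] = True
    return occupied

# A point inside the monitored volume marks its voxel as occupied;
# points outside the volume are simply ignored.
cloud = np.array([[0.25, 0.25, 0.25],   # inside V
                  [5.0, 5.0, 5.0]])     # outside V
grid = voxelize(cloud, volume_min=(0, 0, 0), volume_max=(1, 1, 1), voxel_size=0.5)
```

The boolean array plays the role of the per-voxel occupancy bit described above; a real implementation would refresh it at every sensor sampling period.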
In a first aspect of the invention, a method of detecting proximity between a device and an obstacle is provided, comprising the following steps: establishing an initial geometric description of a device under control as a set of primitive shapes; defining a volume to be monitored around said device under control; obtaining a set of points in space by means of a 3D sensor; performing a first filtering of said set of points in space to eliminate from said set the points that lie outside said volume to be monitored, obtaining a subset of points; monitoring the geometry of the device under control, updating the geometric description of the device under control represented as a set of primitive shapes; performing a second filtering of said subset of points to eliminate the points that remain within said updated geometric description of the device under control, obtaining a set of points in space that represents an obstacle within said volume to be monitored; and calculating a distance between said updated geometric description of the device under control and said set of points in space that represents said obstacle.

In embodiments of the invention, the 3D sensor is located on said device under control or close to it.

In embodiments of the invention, the 3D sensor moves at the same time as said device under control, or equipment that carries or transports said device under control, is moved.

In embodiments of the invention, the volume to be monitored is shifted at the same time as said device under control, or equipment that carries or transports said device under control, is moved.

In embodiments of the invention, the first filtering of said set of points in space obtained by the 3D sensor also eliminates points that represent noise.
In embodiments of the invention, the calculated distance between said updated geometric description of the device under control and said set of points in space representing said obstacle is a minimum distance.

In embodiments of the invention, a simplified geometric description of the device under control is obtained in said monitoring step. In some of these embodiments, the simplified geometric description of the device under control is used to calculate said distance.

In embodiments of the invention, a step is also performed of establishing, from said distance, a collision risk between said device under control and said obstacle, or of selecting a set of collision risk points located on said device under control and/or on said obstacle.

In embodiments of the invention, the above steps are repeated with a certain frequency, the updated geometric description of the device under control and the set of points in space representing said obstacle being updated with said frequency, thus recalculating said distance.

In embodiments of the invention, a step of initially calibrating the relative position between the sensor, the device under control and the volume to be monitored is also performed.
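The two filtering steps of the claimed method can be sketched as follows. This is an illustrative sketch only: the monitored volume and the primitive shapes of A(t) are both approximated here by axis-aligned boxes, whereas the patent allows arbitrary primitives.

```python
import numpy as np

def extract_obstacle_points(cloud_n1, volume, device_boxes):
    """Apply the two claimed filterings to a sensor point cloud.

    cloud_n1     : (N, 3) point cloud N1 from the 3D sensor
    volume       : (lo, hi) corners of the monitored volume V
    device_boxes : list of (lo, hi) boxes approximating A(t)
    Returns B(t): the points representing obstacles inside V.
    """
    lo, hi = (np.asarray(c, dtype=float) for c in volume)
    # First filtering: discard points outside the monitored volume -> N2.
    n2 = cloud_n1[np.all((cloud_n1 >= lo) & (cloud_n1 <= hi), axis=1)]
    # Second filtering: discard points inside the device's own geometry A(t).
    in_device = np.zeros(len(n2), dtype=bool)
    for b_lo, b_hi in device_boxes:
        in_device |= np.all((n2 >= b_lo) & (n2 <= b_hi), axis=1)
    return n2[~in_device]                                # B(t)

cloud_n1 = np.array([[3.0, 0.5, 0.5],    # obstacle point inside V
                     [0.5, 0.5, 0.5],    # point on the device itself
                     [9.0, 9.0, 9.0]])   # point outside V
b_t = extract_obstacle_points(cloud_n1,
                              volume=((-5, -5, -5), (5, 5, 5)),
                              device_boxes=[((0, 0, 0), (1, 1, 1))])
```

Only the first point survives both filterings; in the claimed method this residual set B(t) is what the distance is computed against.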
In a second aspect of the invention, a proximity detection system between a device and an obstacle is provided, comprising: means for establishing an initial geometric description of a device under control as a set of primitive shapes; means for defining a volume to be monitored around said device under control; means for obtaining a set of points in space by means of a 3D sensor; means for performing a first filtering of said set of points in space to eliminate from said set the points that lie outside said volume to be monitored, obtaining a subset of points; means for monitoring the geometry of the device under control, updating the geometric description of the device under control represented as a set of primitive shapes; means for performing a second filtering of said subset of points to eliminate the points that remain within said updated geometric description of the device under control, obtaining a set of points in space that represents an obstacle within said volume to be monitored; and means for calculating a distance between said updated geometric description of the device under control and said set of points in space representing said obstacle.

In embodiments of the invention, the system further comprises a configuration module configured to provide an interface in which to set said volume to be monitored and said initial geometric description of the device under control.

In embodiments of the invention, the system further comprises memory storage means.

In embodiments of the invention, the system further comprises a control interface configured to visualize the possible collision points between the device under control and the obstacle.

In a third aspect of the invention, a computer program is provided comprising computer program code instructions for performing the method described above.

As can be seen, the method, system and computer program provide appreciable advantages over known methods.
For example, unlike US 2006/0287829 A1, whose method requires selecting a priori the risk points of the vehicle to be monitored, in the present invention a 3D sensor covering an area to be monitored around the device under control is sufficient, without the need to deploy an installation of sensors. With respect to US 2015/0254985 A1, which needs an infrastructure of devices external to the mobile element and requires training for the identification of predefined objects based on a thermal footprint, the present method and system establish a collision risk by analyzing and comparing processed point clouds, without the need for such an infrastructure.

Additional advantages and features of the invention will become apparent from the following detailed description and will be pointed out in particular in the appended claims.

BRIEF DESCRIPTION OF THE FIGURES

To complement the description, and in order to help towards a better understanding of the characteristics of the invention in accordance with an example of a practical embodiment thereof, a set of figures is attached as an integral part of the description, in which, by way of illustration and not limitation, the following has been represented:

Figures 1A and 1B schematically illustrate two possible scenarios of application of the method and computer program of the invention.

Figure 2 illustrates a block diagram of the method of the invention.

Figure 3 illustrates a block diagram of the method of the invention including a configuration module.

Figure 4 illustrates an application example in which the method, system and computer program of the invention have been used to monitor an area around a robotic arm hung from a crane. A control interface shows the potential collision points on the screen at a given moment.
DESCRIPTION OF AN EMBODIMENT OF THE INVENTION

Figure 1A schematically illustrates a possible application scenario of the proximity detection method, system and computer program according to a possible embodiment of the invention. It schematically shows a device under control 12. The device under control 12 is hung from or placed on a piece of equipment 11; specifically, a guide 11 is shown from which the device 12 can be hung, increasing its radius of action, since the device 12 moves (or can potentially move), changing its configuration, and in addition the whole assembly can be moved along the axis or guide. Note that this is a typical (but not limiting) use case in the automotive industry for painting or welding parts. A 3D sensor 13 is also shown, in this case located on the device under control 12 itself. Alternatively, the sensor can be placed in the environment or surroundings of the device under control, for example coupled to equipment that carries or transports the device under control, so that the sensor moves at the same time as the device under control. The 3D sensor itself is outside the scope of the present invention.

A monitored volume V1 is also shown. Volume V1 represents the space within which it is desired to monitor the presence of obstacles that may pose a risk of collision with the device under control 12. In this embodiment, volume V1 has been defined as an orthohedron (rectangular cuboid), but the three-dimensional shape of volume V1 may be different (for example, but not limited to, a cylinder, a sphere, a prism, etc.). Potential obstacles within volume V1 at different instants of time will be detected by the sensor 13. The monitored volume V1 moves with the device under control 12. To carry out the distance calculation (and the associated collision risk) by the method and computer program, explained below, the device under control 12 is represented in said program and method by means of its geometry at a given moment, A(t).
This geometry is modeled or represented in said computer program as a set of primitive shapes, for example as a set of prisms. In the scheme of Figure 1A, an obstacle 14 is represented as a person approaching the volume V1. The geometry B(t) of the obstacle 14 detected by the sensor is represented or modeled by the method and computer program as the set of voxels occupied by the obstacle 14 within the volume V1 under inspection. That is, when obstacle 14 is within volume V1, the voxels corresponding to obstacle 14 are marked as occupied. When the obstacle 14, generally in motion, is partially within the volume V1 and partially outside it, as in the example shown in Figure 1A, at each instant of time only the voxels within the volume V1 are marked as occupied (in fact, the method and program exclude voxels outside volume V1 even if they belong to the same obstacle, as explained in relation to the diagram in Figure 2). Therefore, the geometry B(t) varies with time, since the sensor's perception of the obstacle varies as the sensor moves (and, as the case may be, as the obstacle also moves); that is, the data related to an obstacle within the volume V1 that always accompanies the device under control vary.

Figure 1A also shows an arrow representing a distance D1, which in turn indicates a risk of collision R1 between the device under control 12 and obstacle 14. That is, in the method and computer program, the risk of collision R1 is computed from a distance D1 between the geometry A(t) of the device under control 12 at a given instant and the geometry B(t) of the obstacle 14 detected by the sensor 13 within the selected volume V1 at a given instant. As can be seen, both the device under control 12 and the obstacle 14 are potentially mobile; that is, they are not necessarily static.
Figure 1B illustrates another possible application scenario of the proximity detection method, system and computer program according to another embodiment of the invention. It shows a device under control 22. In this case, the device under control 22 is being transported by a cart or small transport vehicle 21; specifically, the device under control 22 is placed on a mast of said vehicle 21. At the bottom of the mast there is a 3D sensor 23, which is outside the scope of the invention. The sensor 23 could be placed on another part of the vehicle 21 or on the device under control 22 itself. A monitored volume V2 is also shown, which represents the space within which it is desired to monitor the presence of obstacles that may pose a risk of collision with the device under control 22. In this case, both the volume V2 and the device under control 22 move when the vehicle 21 moves. In this embodiment, the volume V2 has been defined as a cylinder, but the three-dimensional shape of the volume V2 may be different (for example, but not limited to, an orthohedron, a sphere, a prism, etc.). Potential obstacles at different instants of time will be detected by the sensor 23.

As in the case described in Figure 1A, for the collision calculation carried out by the method and computer program, explained below, the device 22 is represented by its geometry at a given moment, A(t). This geometry is represented as a set of primitive shapes (not illustrated in Figures 1A or 1B). In the scheme of Figure 1B, an obstacle 24 is represented as a column; a static obstacle has therefore been illustrated in this case. Note that although in this case the obstacle is static, the geometry of the obstacle B(t) detected by the sensor 23 varies over time, because the sensor 23 moves along with the vehicle 21 carrying the device under control 22.
As in the previous case, B(t) represents the set of voxels occupied by the obstacle within the volume V2 and detected by the sensor 23 within the volume V2 that always accompanies the device under control 22. That is, as the device under control 22 moves, what the sensor 23 (which moves with the device under control 22) detects changes. For example, if the device under control 22, travelling on the vehicle 21, approaches the obstacle 24, at each instant of time B(t) will have more voxels marked as occupied. The geometry B(t) of the obstacle 24 captured by the sensor 23 is represented as the set of voxels occupied by the obstacle 24 within the volume V2 under inspection. That is, when the device under control 22 approaches the obstacle 24, and therefore when the volume V2 moves together with the sensor 23, the obstacle 24 can come to lie, totally or partially, within the volume V2. Conversely, if the movement is different, for example moving away from obstacle 24, the obstacle may cease to be within volume V2. The voxels corresponding to the obstacle 24 that remain within the volume V2 at each moment are marked as occupied and represent the geometry of what the 3D sensor 23 detects.

Figure 1B also shows an arrow representing a distance D2, which in turn indicates the risk of collision R2 between the device under control 22 and the obstacle 24. The risk of collision R2 is computed from a distance D2 between the geometry A(t) of the device under control 22 at a given time and the geometry B(t) of the obstacle 24 at a given time (as seen or captured by the sensor 23). In both the embodiment of Figure 1A and that of Figure 1B, the minimum distance between the two geometries is preferably used to calculate the risk of collision R1, R2 at each instant of time.
Because the geometry of the device under control 12, 22 has been described as a set of primitive shapes (i.e., in a simplified form), the time needed to compute the distance, preferably the minimum distance, is significantly reduced. It is thus possible to react to highly dynamic environments.

To execute the method, system and computer program, the relative position between the sensor 13 (or 23), the device under control 12 (or 22) and the volume V1 (or V2) is calibrated beforehand. The calibration is based on establishing a common reference frame for the device 12, 22 and the sensor 13, 23, so that the geometric relationship between the information coming from the sensor 13, 23 with respect to an obstacle 14, 24 and its position in space in the monitored volume V1, V2 can be established. That is to say, during the calibration the transformation parameters needed to establish the position, with respect to the device under control 12, 22, of a voxel indicated by the sensor 13, 23 (which may be on the device 12, 22 or outside it) are determined. In this way, the distances between A(t) and B(t) can be calculated with respect to the same reference origin.

Volume V1, V2 is fixed and is configured at startup. That is, a fixed volume V1, V2 is monitored around the device 12, 22. The volume V1, V2 can be changed for different sessions or applications, but if changed, this must be done at the beginning, for example during a configuration stage, and it remains set for the entire session or, for example, until a new configuration. That is, the defined volume does not change dynamically.

The sampling frequency is preferably given by the sensor 13, 23. That is, A(t) and B(t) are updated every 1/f seconds, where f is the sampling frequency of the sensor 13, 23. In non-limiting examples, the sampling frequency can be 5 Hz (hertz), 10 Hz or 20 Hz.
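The calibration described above amounts to knowing a rigid transform between the sensor frame and the device frame. A minimal sketch (assuming the calibration yields a 4x4 homogeneous transform, a common but not patent-mandated representation) of expressing sensor points in the device's reference frame:

```python
import numpy as np

def sensor_to_device_frame(points_sensor, T_device_sensor):
    """Express sensor points in the device's reference frame using the
    4x4 homogeneous transform obtained during calibration."""
    pts_h = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    return (T_device_sensor @ pts_h.T).T[:, :3]

# Example: sensor mounted 0.5 m above the device origin, no rotation.
T = np.eye(4)
T[2, 3] = 0.5   # translation of the sensor frame expressed in the device frame
p_device = sensor_to_device_frame(np.array([[1.0, 0.0, 0.0]]), T)
```

Once every sensor point is expressed in this common frame, the distances between A(t) and B(t) are measured with respect to the same reference origin, as the text requires.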
Figure 2 illustrates a block diagram of the method, system and computer program for calculating the distance (and the collision risk derived from it), in accordance with a possible embodiment of the invention. The block diagram starts from the following input data:

- A set of points or point cloud 31 obtained by the 3D sensor (13, 23 in Figures 1A and 1B, respectively) at each instant of time, preferably determined by the sampling frequency of the sensor. This set of points represents the element(s) captured by the sensor; that is, it represents what the sensor captures within its field of view and range. These may or may not be obstacles. The sensor captures data, which form a point cloud, and each datum given by the sensor is represented as a voxel. The sensor does not know whether the data collected belong to an obstacle, to the device under control, to a vehicle carrying the device under control, or to anything else. The 3D sensor captures a point cloud periodically; the period is given by the sampling frequency of the sensor. In general, the point cloud at each instant of time is different from the cloud at the previous instant. The sensor may or may not move at the same time as the device under control.

- A monitored volume 32 (V1, V2 in Figures 1A and 1B, respectively). This volume represents the three-dimensional space within which it has been decided to look for possible obstacles. This volume 32 is initially parameterized by an expert (for example, in a computer configuration application). The monitored volume is calibrated at the beginning, with reference to the initial geometry of the device under control A(t=0), explained below.
- An initial three-dimensional geometric description of the device under control, A(t=0) (12, 22 in Figures 1A and 1B, respectively). This geometric description is a geometric representation of the device under control based on primitives (for example, but not limited to, prisms, cylinders, spheres, etc., or combinations thereof). This representation is made by an expert, for example through a configuration application developed for this purpose. Alternatively, it can be made with a design program, such as a CAD tool. That is, in the initial three-dimensional geometric description the device under control is defined as a set of simplified volumes 33. Note that this representation is the initial one, because these volumes may vary in position as the device moves (the geometry A(t) changes).

On the point cloud 31 obtained by the sensor, different operations or stages are performed:

(1) Filtering (block 35) of the point cloud 31 obtained by the 3D sensor, based on the volume to be monitored (V1, V2) established and represented in block 32. Filtering 35 is performed by keeping those data that fall within the volume to be monitored and are therefore of interest. That is, at this stage the points of the point cloud N1 that fall outside the selected volume 32 are excluded, obtaining a point cloud N2. This filtering 35 to obtain the point cloud N2 is updated as the sensor takes new data.

(2) Preferably, a noise filtering (not specifically illustrated in Figure 2, although it can be understood as included in block 35). The sensor has a certain base noise that generates false detections. To minimize the noise effect associated with the sensor, a threshold on the minimum number of consecutive occupied voxels is preferably set for a detection to be considered relevant and treated as an obstacle.
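The noise filtering in (2) can be sketched as follows. The patent only speaks of a minimum number of consecutive occupied voxels; the sketch below assumes, for illustration, that "consecutive" means 6-connected clusters in the voxel grid:

```python
import numpy as np
from collections import deque

def filter_sensor_noise(occupied, min_cluster=2):
    """Drop isolated occupied voxels: only clusters of at least
    `min_cluster` 6-connected voxels are kept as potential obstacles."""
    occ = set(map(tuple, np.argwhere(occupied)))
    kept = np.zeros_like(occupied)
    seen = set()
    for start in occ:
        if start in seen:
            continue
        # Flood-fill one 6-connected cluster of occupied voxels.
        cluster, queue = [], deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            cluster.append(v)
            for axis in range(3):
                for step in (-1, 1):
                    n = list(v); n[axis] += step; n = tuple(n)
                    if n in occ and n not in seen:
                        seen.add(n); queue.append(n)
        if len(cluster) >= min_cluster:      # below threshold -> treated as noise
            for v in cluster:
                kept[v] = True
    return kept

grid = np.zeros((4, 4, 4), dtype=bool)
grid[0, 0, 0] = True                  # an isolated voxel: sensor noise
grid[2, 2, 2] = grid[2, 2, 3] = True  # two adjacent voxels: a relevant detection
clean = filter_sensor_noise(grid, min_cluster=2)
```

A library such as scipy.ndimage offers faster connected-component labeling; the explicit flood fill is used here only to keep the sketch self-contained.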
(3) Self-filtering (block 36) of the device under control in the cloud of points N2, so that it is not considered an obstacle within the volume monitored by the sensor (since in some cases the sensor may be “seeing” elements of the device under control itself). During the execution of the software and method, the three-dimensional geometric description of the device under control (initially A(t=0)) is updated as movements are executed, giving A(t) (block 34). In addition, a simplified three-dimensional geometric description As(t) (block 33) is also obtained. Both the updated geometry of the monitored equipment at time t, A(t) (block 34), and the updated simplified geometry As(t) (block 33) are obtained from a device controller (said controller is outside the scope of the present invention). The updated data A(t) preferably feeds the volume self-filtering block (block 36), since A(t) is a more faithful representation of the device under control than As(t). Imagine, for example, a manipulator arm mounted on a vehicle. At an initial moment it is in a fixed position, but if it is going to grab something, its geometry will change from a retracted arm to an extended arm. This is represented as A(t). This self-filtering (block 36) takes into account the monitored geometric representation A(t) at the instant of time t, which can change at each moment, each time the device under control moves. Preferably, this self-filtering 36 also takes into account, for safety, a set of padding parameters (such as an inflation of the geometry) around the geometric volumes that represent A(t). The simplified geometry As(t) is preferably used for the subsequent calculation of distances, as explained below, since it simplifies the calculation. Alternatively, the non-simplified geometry A(t) can be used to calculate the distances.
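As a minimal illustration of this self-filtering stage, the sketch below removes from N2 the points that fall inside a padded geometry A(t). Purely as an assumption, A(t) is modeled here as a set of spheres (one of the admissible primitive forms); all names, values and the choice of spheres are illustrative, not prescribed by the method.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]
Sphere = Tuple[Point, float]  # (center, radius) — an illustrative primitive form

def self_filter(cloud_n2: List[Point], geometry_a_t: List[Sphere],
                padding: float = 0.10) -> List[Point]:
    """Self-filtering (block 36): drop points lying inside any primitive of
    A(t), each inflated by a safety padding."""
    def inside_any(p: Point) -> bool:
        return any(math.dist(p, center) <= radius + padding
                   for center, radius in geometry_a_t)
    return [p for p in cloud_n2 if not inside_any(p)]

# A(t): two spheres approximating, e.g., the base and arm of a manipulator.
a_t: List[Sphere] = [((0.0, 0.0, 0.5), 0.4), ((0.0, 0.0, 1.2), 0.3)]
cloud_n2: List[Point] = [(0.1, 0.0, 0.5),   # on the device itself -> filtered out
                         (1.5, 1.5, 1.0)]   # a genuine obstacle -> kept
b_t = self_filter(cloud_n2, a_t)            # points representing the obstacle
```

A real implementation would refresh `a_t` from the device controller at every cycle; here it is a fixed example value.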
As an example, the controller of the device under control from which the updated geometry data (and also the updated simplified geometry) are obtained can be the controller of a robot (if a robot is being controlled) or, if a crane is being controlled, its control cabinet, etc. In one possible embodiment, all three types of operations (1), (2) and (3) are used. From the aforementioned input data (set or cloud of points N1 obtained by the 3D sensor in block 31, monitored volume (block 32) and geometric description A(t) and/or As(t) of the device under control), the calculation of the distance that represents a collision risk is carried out as follows. The filtering step (block 35) is performed so that, from the three-dimensional cloud of points N1 obtained by the sensor in block 31, and from the monitored volume 32, the set of points N1 is filtered (block 35) to eliminate: the points that are outside the monitored volume 32, and optionally, the points that are considered noise. After this filtering block 35, which results in a reduced set or cloud of points N2, a new filtering 36 of the points remaining within the geometry of the device under control is performed, taking into account the monitored geometry of the device under control (preferably the non-simplified monitored geometry A(t) obtained in block 34). Thus, only the points representing one or more obstacles are selected (block 37). These points represent the geometry of the obstacle(s) detected by the sensor within the volume at a given time, B(t). Like the monitored geometry of the device under control A(t) obtained in block 34 (or, where appropriate, As(t) obtained in block 33), the geometry of obstacles B(t) detected by the sensor is also updated at least as the device under control moves (and optionally as the obstacle, which may also be in motion, moves).
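The first filtering (block 35) and the optional noise filtering can be sketched as follows. The axis-aligned box for the monitored volume, the per-voxel occupancy count used as the noise criterion, and all names and values are illustrative assumptions: the text above speaks of a threshold of consecutive occupied voxels, which is approximated here by a minimum number of points per voxel.

```python
from collections import Counter
from typing import List, Tuple

Point = Tuple[float, float, float]

def filter_volume(cloud_n1: List[Point], lower: Point, upper: Point) -> List[Point]:
    """First filtering (block 35): keep only the points inside the monitored
    volume, modeled here (an illustrative choice) as an axis-aligned box."""
    return [p for p in cloud_n1
            if all(lo <= c <= hi for c, lo, hi in zip(p, lower, upper))]

def filter_noise(points: List[Point], voxel_size: float = 0.05,
                 min_points: int = 3) -> List[Point]:
    """Optional noise filtering: discard points in sparsely occupied voxels
    (a simplified stand-in for the consecutive-occupied-voxels threshold)."""
    def voxel_of(p: Point):
        # Snap each coordinate to its nearest voxel index.
        return tuple(round(c / voxel_size) for c in p)
    counts = Counter(voxel_of(p) for p in points)
    return [p for p in points if counts[voxel_of(p)] >= min_points]

raw_cloud: List[Point] = [
    (0.50, 0.50, 1.00), (0.51, 0.50, 1.00), (0.50, 0.51, 1.00),  # dense cluster
    (1.90, 1.90, 2.90),                                          # isolated -> noise
    (5.00, 0.00, 1.00),                                          # outside the volume
]
cloud_n2 = filter_noise(filter_volume(raw_cloud, (-2.0, -2.0, 0.0), (2.0, 2.0, 3.0)))
```

Only the three clustered points survive both filters: the out-of-volume point is dropped by block 35 and the isolated point by the noise threshold.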
That is, a subset of points in space B(t) representing an obstacle 14, 24 located within the volume to be monitored V1, V2, 32 is obtained. In other words, a 3D representation B(t) of the elements (obstacles) within the monitored volume 32 is obtained from the cloud of filtered points N2 (points captured by the sensor that fall within the selected volume) and from the monitored three-dimensional geometry of the device under control (preferably the non-simplified geometry A(t)) at each instant of time. Finally, from the 3D representation B(t) of the obstacles within the monitored volume and the geometric representation of the monitored device under control, a proximity query is made (block 38) between both representations; that is, a distance between the geometry of the device under control and the geometry of the obstacle B(t) is calculated. For this calculation, the simplified geometry As(t) is preferably used, although alternatively the geometry A(t) can be used. The minimum distance between both geometries is preferably calculated. Depending on the distance obtained, a collision risk can be established based on certain previously defined risk thresholds. These thresholds may vary depending on the applications or industrial sectors in which the present method and computer program are used. In other words, two point sets are handled at all times: A(t) (and/or As(t)), which represents the geometry of the device at each instant of time, and a set of points (which varies over time) that represents what the sensor picks up. From this set of points, the points that fall outside the monitored volume and the points that correspond to the device under control (if the sensor is seeing part of the device) are eliminated, obtaining B(t).
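A minimal sketch of the distance query (block 38) follows, again assuming As(t) is given as spheres. The warning and stop thresholds are invented example values, since the text leaves the risk thresholds application-dependent.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]
Sphere = Tuple[Point, float]  # (center, radius)

def min_distance(as_t: List[Sphere], b_t: List[Point]) -> float:
    """Minimum distance between the simplified geometry As(t) and the obstacle
    points B(t); for spheres this is point-to-center distance minus radius."""
    return min(math.dist(p, center) - radius
               for p in b_t for center, radius in as_t)

def collision_risk(distance: float,
                   warn: float = 1.0, stop: float = 0.3) -> str:
    """Map the distance to a risk level (thresholds are illustrative)."""
    if distance <= stop:
        return "STOP"
    if distance <= warn:
        return "WARNING"
    return "SAFE"

as_t: List[Sphere] = [((0.0, 0.0, 1.0), 0.5)]
b_t: List[Point] = [(2.0, 0.0, 1.0), (0.0, 1.2, 1.0)]
d = min_distance(as_t, b_t)   # 0.7 m, from the closest obstacle point
risk = collision_risk(d)      # "WARNING"
```

Using the simplified geometry keeps this query cheap: the cost is one distance evaluation per (obstacle point, primitive) pair.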
Then, at each instant of time, distances between As(t) and B(t) (or, less preferably, between A(t) and B(t)) are calculated, and at each instant of time it is the smallest distance that determines the risk of collision (by establishing a risk threshold). This also makes it possible to obtain the point of the device under control that is most at risk (because its geometry is known), and this point of greatest risk can be represented, for example, in a control interface. This process is repeated as often as deemed appropriate depending on the application, sector, travel speed of the device under control, etc. In embodiments of the invention, this process is repeated at intervals of less than 5 minutes, for example, but not limited to, less than 1 minute, less than 10 seconds, less than 1 second or less than 100 milliseconds. The method and computer program may also include a configuration block, module or system. Figure 3 shows a complete scheme of the method and computer program, in which the configuration module 41 has been added to the block diagram of Figure 2. The configuration module 41 provides an interface in which to establish the parameters associated with the monitored volume 32 and the geometric description A(t=0) of the device under control, used as inputs by the method or computer program of Figure 2 (or the lower part of Figure 3). The configuration module 41 allows these parameters to be defined through an interface that the method and computer program then use. The method and computer program are implemented in a device comprising processing means (such as a processor or microprocessor) and memory storage means configured to store the data that are acquired and calculated, such as the different point clouds and the distances between the device under control and the obstacle. The method, system and computer program can be connected to a control interface.
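Putting the stages together, one monitoring cycle (to be repeated at the chosen frequency) could look like the following sketch. As before, the box-shaped volume, the sphere primitives, the padding and the thresholds are illustrative assumptions rather than the method's prescribed values.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float, float]
Sphere = Tuple[Point, float]
Box = Tuple[Point, Point]  # (lower corner, upper corner)

def proximity_cycle(cloud_n1: List[Point], volume: Box, a_t: List[Sphere],
                    padding: float = 0.1, warn: float = 1.0,
                    stop: float = 0.3) -> Tuple[Optional[float], str]:
    """One cycle: volume filtering (35), self-filtering (36) and the
    minimum-distance / risk query (38). Returns (distance, risk level)."""
    lower, upper = volume
    # Block 35: keep only points inside the monitored volume.
    n2 = [p for p in cloud_n1
          if all(lo <= c <= hi for c, lo, hi in zip(p, lower, upper))]
    # Block 36: drop points inside the padded device geometry A(t).
    b_t = [p for p in n2
           if all(math.dist(p, c) > r + padding for c, r in a_t)]
    if not b_t:
        return None, "SAFE"
    # Block 38: minimum distance to the obstacle points, then threshold it.
    d = min(math.dist(p, c) - r for p in b_t for c, r in a_t)
    return d, "STOP" if d <= stop else "WARNING" if d <= warn else "SAFE"

a_t: List[Sphere] = [((0.0, 0.0, 1.0), 0.5)]
volume: Box = ((-2.0, -2.0, 0.0), (2.0, 2.0, 3.0))
cloud: List[Point] = [(0.0, 1.2, 1.0),   # an obstacle inside the volume
                      (5.0, 0.0, 1.0)]   # outside the volume -> ignored
d, risk = proximity_cycle(cloud, volume, a_t)
```

In a deployment, this function would run once per sensor sample (e.g. every 100 ms), with `cloud` read from the 3D sensor and `a_t` from the device controller.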
For example, a PC connected by cable (or wirelessly) to the control cabinet of the device under control can be installed; this PC is also connected by cable (or wirelessly) to the sensor. The PC can move together with the device under control; that is, if the device moves on a cart, the PC can be on the cart. An experiment has been carried out in which the method, system and computer program of the invention have been used to monitor an area around a robotic arm suspended from a crane. Figure 4 shows the crane and robotic arm of the experiment. The device that implements the method and computer program is a PC connected to a 3D sensor and to the control cabinet of the robotic arm under control. The 3D sensor is a LIDAR proximity sensor (Velodyne) and has been placed on the robotic arm. A control interface shows the potential collision points at a given moment. Figure 4 shows a representation of two possible collision points (the two black dots in the image) in the control interface at a given time. As can be seen, one collision risk point detected on the robotic arm is shown and another collision risk point detected on a detected obstacle (in the lower left part of the figure). Dotted lines forming an orthohedron represent the monitored volume. In this text, the word “comprise” and its variants (such as “comprising”, etc.) should not be interpreted in an exclusive way; that is, they do not exclude the possibility that what is described includes other elements, steps, etc. On the other hand, the invention is not limited to the specific embodiments that have been described, but also covers, for example, the variants that can be made by the average person skilled in the art (for example, as regards the choice of materials, dimensions, components, configuration, etc.), within what follows from the claims.
Claims:
Claims (16)

[1] 1. A method of proximity detection between a device and an obstacle, characterized by:
establishing an initial geometric description (A(t=0)) of a device under control (12, 22) as a set of primitive forms;
defining (32) a volume to be monitored (V1, V2) around said device under control (12, 22);
obtaining (31) a set of points in space (N1) using a 3D sensor (13, 23);
performing a first filtering (35) of said set of points in space (N1) to eliminate from said set of points (N1) the points that lie outside said volume to be monitored (V1, V2), obtaining a subset of points (N2);
monitoring (33, 34) the geometry of the device under control (12, 22), updating the geometric description of the device under control (As(t), A(t)) represented as a set of primitive forms;
performing a second filtering (36) of said subset of points (N2) to eliminate the points that remain within said updated geometric description (A(t), As(t)) of the device under control (12, 22), obtaining a set of points in space B(t) representing an obstacle (14, 24) within said volume to be monitored (V1, V2);
calculating a distance (D1, D2) between said updated geometric description (A(t), As(t)) of the device under control (12, 22) and said set of points in space B(t) representing said obstacle (14, 24).

[2] 2. The method of claim 1, wherein said 3D sensor (13, 23) is located in said device under control (12, 22) or close to it.

[3] 3. The method of any of the preceding claims, wherein said 3D sensor (13, 23) moves at the same time as the device under control (12, 22) or at the same time as a device (11, 21) carrying or transporting said device under control (12, 22).

[4] 4. The method of any of the preceding claims, wherein said volume to be monitored (V1, V2) moves at the same time as said device under control (12, 22) or at the same time as a device (11, 21) carrying or transporting said device under control (12, 22).

[5] 5.
The method of any of the preceding claims, wherein in said first filtering (35) of said set of points in space (N1) obtained by the 3D sensor (13, 23), the points that represent noise are also eliminated.

[6] 6. The method of any of the preceding claims, wherein said distance (D1, D2) calculated between said updated geometric description (A(t), As(t)) of the device under control (12, 22) and said set of points in space B(t) representing said obstacle (14, 24) is a minimum distance.

[7] 7. The method of any of the preceding claims, wherein in said monitoring stage (33) a simplified geometric description (As(t)) of the device under control (12, 22) is obtained.

[8] 8. The method of claim 7, wherein said simplified geometric description (As(t)) of the device under control (12, 22) is used to calculate said distance (D1, D2).

[9] 9. The method of any of the preceding claims, further comprising a step of, from said distance (D1, D2), establishing a risk of collision (R1, R2) between said device under control (12, 22) and said obstacle (14, 24), or selecting a set of collision risk points located on said device under control (12, 22) and/or on said obstacle (14, 24).

[10] 10. The method of any of the preceding claims, wherein the above steps are repeated with a certain frequency, the updated geometric description (A(t), As(t)) of the device under control (12, 22) and the set of points in space B(t) representing said obstacle (14, 24) being updated with said frequency, thus recalculating said distance (D1, D2).

[11] 11. The method of any of the preceding claims, further comprising the step of initially calibrating the relative position between the sensor (13, 23), the device under control (12, 22) and the volume to be monitored (V1, V2).
[12] 12. A proximity detection system between a device and an obstacle, characterized by:
means for establishing an initial geometric description (A(t=0)) of a device under control (12, 22) as a set of primitive forms;
means for defining (32) a volume to be monitored (V1, V2) around said device under control (12, 22);
means for obtaining (31) a set of points in space (N1) by a 3D sensor (13, 23);
means for performing a first filtering (35) of said set of points in space (N1) to eliminate from said set of points (N1) the points that lie outside said volume to be monitored (V1, V2), obtaining a subset of points (N2);
means for monitoring (34) the geometry of the device under control (12, 22), updating the geometric description of the device under control (A(t), As(t)) represented as a set of primitive forms;
means for performing a second filtering (36) of said subset of points (N2) to eliminate the points that remain within said updated geometric description (A(t), As(t)) of the device under control (12, 22), obtaining a set of points in space B(t) that represent an obstacle (14, 24) within said volume to be monitored (V1, V2);
means for calculating a distance (D1, D2) between said updated geometric description (A(t), As(t)) of the device under control (12, 22) and said set of points in space B(t) representing said obstacle (14, 24).

[13] 13. The system of claim 12, further comprising a configuration module (41) configured to provide an interface in which to establish said volume to be monitored (V1, V2) and said initial geometric description (A(t=0)) of the device under control (12, 22).

[14] 14. The system of any of claims 12 or 13, further comprising memory storage means.

[15] 15. The system of any of claims 12 to 14, further comprising a control interface configured to visualize the possible collision points between the device under control (12, 22) and the obstacle (14, 24).

[16] 16.
A computer program comprising computer program code instructions for performing the method according to any of claims 1 to 11.

FIG. 1A, FIG. 1B, FIG. 2, FIG. 3, FIG. 4
Family patents:
Publication number | Publication date
ES2681123B1 | 2019-06-18
WO2018162781A1 | 2018-09-13
Legal status:
2018-09-11 | BA2A | Patent application published | Ref document number: 2681123 (ES, kind code A1), effective date: 2018-09-11
2019-06-18 | FG2A | Definitive protection | Ref document number: 2681123 (ES, kind code B1), effective date: 2019-06-18
Priority:
Application number | Publication number | Priority date | Filing date | Title
ES201730325A | ES2681123B1 | 2017-03-10 | 2017-03-10 | METHOD, SYSTEM AND COMPUTER PROGRAM OF PROXIMITY DETECTION
PCT/ES2018/070176 | WO2018162781A1 | 2017-03-10 | 2018-03-09 | Method, system and computer program for detecting proximity