SYSTEM AND METHOD FOR LOCATING IMPACTS ON EXTERNAL SURFACE
Patent abstract:
The invention relates to a method for locating impacts on an external surface (110) of a body (A), characterized in that it comprises the following steps: a) modeling the body in a first database of a control unit (300) to obtain a virtual model of said body in a virtual coordinate system; b) modeling, in a second database of the control unit, a plurality of point clouds in the virtual coordinate system, each point cloud, called an inspection zone, representing a portion of the external surface of the body (A); c) selection of an inspection zone via a man-machine interface of the control unit; d) transfer, by a central unit of the control unit, of the coordinates of each point of the first and second databases from the virtual coordinate system to a geographical reference frame; e) determination of the coordinates, in the geographical reference frame, of the initial position of a flying drone (200) in communication with the central unit, the drone being equipped with a rangefinder (243); f) calculation, by the central unit, of a flight plan of the drone so as to scan, via the rangefinder, the selected inspection zone; g) creation, by the central unit, of a 3D mesh of the selected inspection zone scanned by the rangefinder; h) detection of the impacts, by the central unit, by a comparison between the 3D mesh and the virtual model of said body, and calculation of the coordinates of each impact in the geographical reference frame and the virtual coordinate system.

Publication number: FR3038109A1
Application number: FR1556048
Filing date: 2015-06-29
Publication date: 2016-12-30
Inventors: Ronie Gnecco; Denis Gattone; David Martinez; Jean-Louis Guittard
Applicant: Airbus Operations SAS
IPC main class:
Patent description:
[0001] The present invention relates to a system and a method for determining the coordinates of impacts occurring on the external surface of a body or vehicle, such as, for example, dents in the outer surface (or skin) of an aircraft caused by a falling object or a collision with a bird.

[0002] Impacts on the outer surface (or skin) of an aircraft are currently located using a tape measure to measure the distances from the impact to reference points, namely structural elements such as the numbered frames, stringers or ribs forming the internal structure of the fuselage or wings. Such an operation mobilizes several technicians and is slow to carry out, especially when it requires erecting scaffolding to reach an upper part of the aircraft where the impact is located. The object of the invention is to locate impacts on the outer surface of a fuselage in a faster and simpler way. To this end, the invention relates to a method for locating impacts on an external surface of a body, comprising the following steps: a) modeling the body in a first database of a control unit in order to obtain a virtual model of said body in a virtual coordinate system; b) modeling, in a second database of the control unit, a plurality of point clouds in the virtual coordinate system, each point cloud, called an inspection zone, representing a portion of the external surface of the body; c) selection of an inspection zone via a man-machine interface of the control unit; d) transfer, by a central unit of the control unit, of the coordinates of each point of the first and second databases from the virtual coordinate system to a geographical reference frame; e) determination of the coordinates, in the geographical reference frame, of the initial position of a flying drone in communication with the central unit, the drone being equipped with a rangefinder; f) calculation, by the central unit, of a flight plan of the drone so as to scan, via the rangefinder, the selected
inspection area; g) creation, by the central unit, of a 3D mesh of the selected inspection zone scanned by the rangefinder; h) detection of the impacts, by the central unit, by a comparison between the 3D mesh and the virtual model of said body, and calculation, by the central unit, of the coordinates of each impact in the geographical reference frame and the virtual coordinate system. The invention also relates to the system for implementing this method. The invention thus makes it possible, via the use of a drone and the selection of an inspection zone of the aircraft, to quickly detect and locate impacts on the outer surface 110 of said aircraft A.

[0003] The characteristics of the invention mentioned above, as well as others, will appear more clearly on reading the following description of exemplary embodiments, given with reference to the attached drawings, in which: FIG. 1 is a diagram of the impact location system according to one embodiment of the invention for detecting and locating impacts on the external surface of an aircraft, said system comprising a flying drone in communication with a control unit on the ground; FIG. 2 is a schematic view of the flying drone shown in FIG. 1; FIG. 3 is a schematic view of the control unit shown in FIG. 1; and FIG. 4 is a diagram illustrating an implementation of the system according to the invention to locate an impact and determine its coordinates on the outer surface of an aircraft using a virtual model of said aircraft and a flying drone. According to the invention and with reference to FIGS. 1 to 3, the system 100 for locating impacts on the outer surface 110 of an aircraft A comprises a flying drone 200 provided with at least one rangefinder 243 for scanning said surface 110, and a control unit 300 placed on the ground that provides both the control functions of the drone 200 and the processing of the data it supplies.
The flying drone 200, illustrated in more detail in FIG. 2, is for example a quadrocopter-type drone comprising a frame 220 having a central portion 230. The latter, of circular shape, carries an electronic payload 240 and four extensions 250 extending outward from the central portion 230. Each extension carries at its free end a propulsion unit 260 consisting of a motor driving a propeller. The electronic payload 240 is composed of a controller 241 connected to various sensors 242, a rangefinder 243 configured to scan the outer surface 110 of the aircraft, and a signal transmitting/receiving means 244 for communicating with the ground control unit. The rangefinder 243 equipping the drone is, for example, an optical, acoustic or radar rangefinder. Preferably, it is an optical rangefinder of the LIDAR type (light detection and ranging) carried by the frame 220, for example underneath it. The LIDAR comprises, in a known manner, a pulsed laser transmitter, a receiver comprising a light collector, a photodetector transforming the collected light into an electrical signal, as well as an electronic chain for processing said signal. The LIDAR calculates the distance separating the drone 200 from the external surface 110 of the aircraft from the time difference between the emission, by the laser transmitter, of a pulse and the reception, by the photodetector, of the luminous echo reflected by the outer surface 110. The measured distances, called telemetry data, make it possible to determine the coordinates of the points of the scanned surface and to build a 3D mesh of it. The sensors 242 of the drone comprise at least one position sensor and one orientation sensor. The orientation sensor determines the inclination of the attitude of the drone with respect to a horizontal plane H (the ground), and is for example a gyroscopic sensor or an inclinometer.
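As a numerical illustration of the time-of-flight principle just described, the following sketch computes the drone-to-surface distance from the pulse delay; the function name and structure are illustrative assumptions, not code from the patent:

```python
# Illustrative sketch of the LIDAR time-of-flight computation described
# above; names and structure are assumptions, not the patent's code.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(emission_time_s: float, echo_time_s: float) -> float:
    """Distance between the drone and the scanned surface, derived from
    the delay between pulse emission and echo reception. The pulse
    travels out and back, hence the division by two."""
    delay_s = echo_time_s - emission_time_s
    if delay_s < 0:
        raise ValueError("echo cannot precede emission")
    return SPEED_OF_LIGHT_M_S * delay_s / 2.0
```

A round-trip delay of about 20 ns thus corresponds to a stand-off of roughly 3 m, the preferred clearance mentioned later in the description.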
The position sensor makes it possible to give the position of a point of the drone in the form of coordinates in a geographical reference frame (latitude, longitude, altitude) and is a receiver of satellite signals, for example GPS (Global Positioning System) signals. In the remainder of the description, it will be considered that all the satellite signals used are GPS signals. It should be noted that the position of the rangefinder 243 with respect to the satellite signal receiver is known and stored in a memory of the controller 241. This position will be used for the construction of a 3D mesh of the scanned surface, as described later in the description. The controller 241 is configured to interface with all the components of the drone 200 and to maintain a stable flight of the drone while obeying the flight instructions (flight plan) supplied to the drone by the control unit 300 via the signal transmitting/receiving means 244. The signal transmitting/receiving means 244 are, for example, formed by a radiofrequency transceiver making it possible to transfer data from the drone 200 to the control unit 300 and to receive flight instructions from the latter. The data from the drone comprise in particular the data from the sensors 242 of the drone as well as the telemetry data provided by the rangefinder 243. It will be noted that the drone 200 is furthermore equipped with protection devices of the proximity-sensor type (not shown), mounted on the frame 220 of the drone so that it does not collide with the aircraft to be inspected. Such a device, connected to the controller 241, is for example an optical proximity sensor operating in analog mode, where the amplitude of the signal it provides to the controller 241 is a function of the relative position of the objects close to the drone 200 and located in the detection field of said sensor.
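The analog proximity protection can be sketched as a simple threshold test on the sensor amplitude; the function name, threshold handling and return values below are illustrative assumptions, not the patent's logic:

```python
def proximity_guard(signal_amplitude: float, threshold: float) -> str:
    """Sketch of the analog proximity protection: the sensor amplitude
    grows as the drone nears an obstacle; above a threshold value the
    controller steers the drone away until the amplitude drops back
    below it. Names and return values are illustrative assumptions."""
    if signal_amplitude > threshold:
        return "move_away_from_surface"
    return "follow_flight_plan"
```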
When an object, such as the outer surface 110 of the aircraft, is located at less than a predetermined distance from the drone 200, the sensor signal exceeds a threshold value. In this case, the controller 241 controls the flight of the drone so as to move it away from the outer surface 110 until the amplitude of the signal falls below said threshold value. Preferably, the predetermined distance is of the order of 3 m, or 1 m.

[0004] With reference to FIG. 3, the ground control unit 300 comprises various elements interfaced with one another, including: a central unit 310 managing the operation of the control unit 300 and the control of the drone 200 (the central unit implements different computer programs and includes memories in which two databases DB1 and DB2 are recorded); a man-machine interface 320 through which an operator can interact with the control unit 300; a display device 330; a camera 340, for example of the CMOS (complementary metal-oxide semiconductor) or CCD (charge-coupled device) type, making it possible to take images that can be displayed on the display device 330; a communication module 350 for transmitting and receiving data; a module 360 for determining the position of the aircraft, configured to determine the position and the orientation of the inspected aircraft A with respect to the control unit 300; a flight control module 370 receiving data from the sensors of the drone and configured to calculate a flight plan of the drone 200 to the outer surface 110 of the aircraft to be inspected and to pilot the flight of the drone according to this plan; an impact detection module 380 configured to detect and locate an impact on the outer surface 110 of the aircraft to be inspected according to the telemetry data provided by the drone 200 and the position of the latter; and satellite positioning means 390, for example GPS satellite signal receivers, for determining the coordinates in the geographical reference frame (latitude, longitude, altitude)
of the ground control unit. [0005] The central unit 310 implements an operating system (OS) which, in a known manner, enables an operator of the control unit to interact with it via the man-machine interface 320, for example by enlarging/reducing or pointing to certain elements displayed on the display device 330. Preferably, the control unit 300 is a commercially available touch tablet whose touch screen combines the functions of the display device 330 and the man-machine interface 320. The communication module 350 is, for example, a transceiver operating with radiofrequency signals. This module makes it possible to receive data from the drone 200 and to transmit data to it. In addition, the communication module 350 allows connection to and data exchange with an external network NET, which is for example the network of an airline or an aircraft manufacturer. The external network NET comprises a remotely searchable shared database in which each identification number IDN of an aircraft (registration, serial number, tail number) is associated with its model (Airbus A350, Airbus A380, Airbus A320, or a model from another manufacturer) and with the coordinates, in the geographical reference frame, of a satellite signal receiver installed in the fuselage of the aircraft. Each of the module 360 for determining the position of the aircraft, the flight control module 370 of the drone and the impact detection module 380 is a program implemented by the central unit 310 and stored in a memory thereof. Each program receives as input data from different elements of the control unit 300. The operation of each of these programs will be described later in connection with a method of locating an impact on an aircraft by means of the system 100 according to the invention.
[0006] The aircraft position determination module 360 and the impact detection module 380 use the first database DB1, which contains data relating to the different aircraft models on which the system 100 according to the invention can be used. The flight control module 370 makes use of the second database DB2, which contains, for each aircraft model, data relating to specific inspection zones of the outer surface 110 of the aircraft. The first database DB1 includes data obtained by computer-assisted three-dimensional modeling of the entirety of the different aircraft models. Thus this database comprises, for each modeled aircraft model, a virtual model V (or DMU: Digital Mock-Up) which comprises the modeling of all the elements of the aircraft A, such as the external surface 110 (skin) of the aircraft, the structural elements forming the internal structure of the aircraft (frames, stringers, floors, ribs, ...), the distinctive elements of the outer surface of the aircraft (for example: portholes, doors, probes, antennas, engines, hatches, flaps, ...) and the satellite signal receiver equipping the aircraft. An operator accessing the first database DB1 via the operating system is able to view a virtual model V of the aircraft on the display device 330 and to determine the coordinates of each point of a modeled element of the aircraft in a virtual coordinate system O, X, Y, Z. The second database DB2 comprises the coordinates, in the virtual coordinate system O, X, Y, Z, of a plurality of point clouds, where each of said clouds represents a portion of the outer surface 110 of each aircraft modeled in the first database. As will be described below, each cloud of points forms a potential scan area for the rangefinder 243 of the drone. In the rest of the description, a point cloud will be called an inspection zone.
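The contents of the two databases described above can be pictured with the following sketch; the dataclass layout and field names are purely illustrative assumptions, not the patent's schema:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# A point expressed in the virtual coordinate system O, X, Y, Z.
Point = Tuple[float, float, float]

@dataclass
class VirtualModel:
    """One DB1 entry: the digital mock-up (DMU) of an aircraft model,
    grouping the modeled elements by name. (Illustrative structure.)"""
    aircraft_model: str
    elements: Dict[str, List[Point]]  # e.g. "skin", "frames", "stringers"

@dataclass
class InspectionZone:
    """One DB2 entry: a named point cloud covering a portion of the
    outer surface of one aircraft model. (Illustrative structure.)"""
    aircraft_model: str
    name: str            # e.g. "upper left wing"
    points: List[Point]  # coordinates in the virtual frame
```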
For example, an inspection zone is the upper part of the left wing of the aircraft, the upper part of the fuselage at the nose of the aircraft, the nose cone (radome) of the aircraft, the left part of the vertical stabilizer, etc. The location of an impact on the outer surface 110 of an aircraft A by an operator of an entity responsible for the inspection of the aircraft (airline, manufacturer), using the system 100 according to the invention, will now be described with reference to FIG. 4.

[0007] In a first step E1, the operator wishing to inspect the fuselage of an aircraft reads the identifier IDN of the aircraft and, by means of the man-machine interface 320, sends this identifier to the external network NET via the communication module 350. In response, the network NET provides the central unit 310 with the model of the aircraft carrying this identifier IDN and the coordinates, in the geographical reference frame, of a point corresponding to the position of the satellite signal receiver equipping the aircraft A. In a second step E2, the central unit 310 determines the position and the orientation of the aircraft A in order to transform the coordinates of the virtual model V recorded in the first database DB1 and of the points of the second database DB2 from the virtual coordinate system O, X, Y, Z to the geographical reference frame, so that the central unit 310 can subsequently calculate a flight plan of the drone 200 to an inspection zone. For this purpose, in a first sub-step E2a of the second step E2, the central unit 310 executes the module 360 for determining the position of the aircraft in order to determine the distance between the satellite signal receiver of the aircraft A and the satellite positioning means 390 of the control unit, each of the aircraft receiver and the satellite positioning means of the control unit being represented in the form of a point having its coordinates in the geographical reference frame.
Then, in a second sub-step E2b, the central unit 310 executes said module 360 for determining the position of the aircraft in order to determine the orientation of the aircraft A. To do this, via a message displayed on the display device 330, the module 360 asks the operator to take, via the camera, an image of the aircraft A on which two symmetrical elements of the aircraft are visible: for example, a front image of the aircraft showing two engines symmetrical to each other with respect to the sagittal plane (passing through its longitudinal axis) of the aircraft, such as the pair of engines located closest to the fuselage. The central unit 310 displays the image on the display device 330 and asks the user to point, with a first cursor, to the center of the fan of the left engine and, with a second cursor, to the center of the fan of the symmetrical right engine. Once this is done, the central unit 310 calculates the distance in the image between the two cursors and compares it to the distance, recorded in the first database DB1, between the two engines in a plane perpendicular to the longitudinal axis of the aircraft. From this information, and in particular from the position of the cursors in the image and the distance calculated in sub-step E2a, the central unit calculates the angle between the longitudinal axis of the aircraft and the control unit 300, and thus deduces the orientation of the longitudinal axis of the aircraft A.

[0008] In a third sub-step E2c, the module 360 for determining the position of the aircraft, knowing the position and the orientation of the aircraft, transposes, from the virtual coordinate system O, X, Y, Z to the geographical reference frame, the coordinates of the points of the elements modeled in the first and second databases DB1, DB2.
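One way to picture the orientation estimate of sub-step E2b: since the two engine centres are symmetric about the sagittal plane, their projected separation in the image shrinks as the view moves away from head-on. The sketch below uses a pinhole-camera model to convert the cursor separation to metres and recovers the viewing angle; the formula, function and parameter names are illustrative assumptions, not the patent's computation:

```python
import math

def aircraft_yaw_angle(pixel_separation: float,
                       focal_length_px: float,
                       camera_to_aircraft_m: float,
                       model_separation_m: float) -> float:
    """Estimate, in radians, the angle between the aircraft's
    longitudinal axis and the camera's line of sight.
    A head-on view shows the full engine separation (angle 0); an
    oblique view foreshortens it by cos(angle). Illustrative sketch."""
    # Pinhole model: metric separation as seen by the camera.
    apparent_m = pixel_separation * camera_to_aircraft_m / focal_length_px
    # Clamp against measurement noise before inverting the cosine.
    ratio = min(1.0, apparent_m / model_separation_m)
    return math.acos(ratio)
```

For instance, if the engines are 10 m apart in DB1 but their projection corresponds to only 5 m, the line of sight is at 60° from the longitudinal axis.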
Consequently, following step E2, each point of the virtual model V recorded in the database DB1 is associated with the coordinates, in the geographical reference frame, of the same point of the aircraft A. In a third step E3, the central unit executes the flight control module 370, which uses the second database DB2 to determine the inspection zones available for the aircraft model considered. Via a message displayed on the display device 330, the flight control module 370 asks the operator to select one of the inspection zones available for the aircraft model in question. It should be noted that the order of steps E3 and E2 could be reversed. In a fourth step E4, once the inspection zone has been chosen and the position and orientation of the aircraft A determined, the central unit 310 records the position of the drone 200 in the geographical reference frame and executes the flight control module 370 to calculate a flight plan of the drone 200. The position of the drone 200 is provided by the sensors 242 of the drone. The flight plan is calculated to ensure a flight of the drone 200 from the initial position of the drone (usually on the ground) to the selected inspection zone, over which the drone follows a line of points, then from the inspection zone back to its initial position. The flight plan is calculated so that the drone 200 does not approach the aircraft within 1 meter, preferably 3 meters. In a fifth step E5, the central unit 310 executes the flight control module 370, which provides flight instructions to the controller 241 of the drone so that the latter follows the points of the flight plan calculated in step E4. During the flight of the drone 200, the sensors 242 of the drone provide information to the control unit 300, and the flight control module 370 controls the drone 200 so as to scan the inspection zone via the rangefinder 243.
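The clearance constraint on the flight plan (no approach within 1 m, preferably 3 m) can be checked with a brute-force sweep over the plan's waypoints; the function name and point representation are illustrative assumptions, and a real planner would use a spatial index rather than a double loop:

```python
def clearance_ok(waypoints, surface_points, min_clearance_m=3.0):
    """Verify that no waypoint of a candidate flight plan comes closer
    to the scanned body than the required stand-off distance.
    Points are (x, y, z) tuples in a common metric frame.
    Illustrative sketch, not the patent's planner."""
    for wx, wy, wz in waypoints:
        for sx, sy, sz in surface_points:
            dist = ((wx - sx) ** 2 + (wy - sy) ** 2 + (wz - sz) ** 2) ** 0.5
            if dist < min_clearance_m:
                return False
    return True
```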
Simultaneously, the central unit 310 executes the detection and location module 380 to analyze the telemetry data received from the drone. The detection module establishes a 3D mesh 3DMAP of the scanned inspection zone, where each point of the 3DMAP mesh has its coordinates in the geographical reference frame. In a sixth step E6, at the end of the flight of the drone 200, the central unit 310 executes the detection and location module 380, which compares the 3D mesh 3DMAP with the virtual model V of the aircraft in order to detect impacts IMP. For this purpose, the areas where the depth differences between the 3D mesh and the external surface of the virtual model V are greater than a predefined value, for example of the order of 1 mm, are considered impacts IMP. For each impact IMP reported, the central unit 310 calculates the geographical coordinates of the impact. [0009] In a seventh step E7, the central unit 310 transfers the coordinates of each impact to the coordinate system O, X, Y, Z and marks the impact IMP by means of a pointer on the virtual model V displayed on the display device 330. In addition, on this virtual model, the central unit points out the frame, stringer or rib closest to the impact IMP. Finally, at the request of the operator, the central unit 310 requests the recording, in a shared database of the network NET, of the coordinates, in the coordinate system O, X, Y, Z, of each impact IMP detected on the aircraft, as well as the numbers of the stringers, frames and ribs that are closest to this impact. The invention makes it possible to quickly make available the historical impact data of an aircraft and rapidly provides the position of an impact with respect to the numbered stringers, frames and ribs to the entity in charge of the maintenance of the aircraft. An operator equipped with the system 100 according to the invention may, during a subsequent maintenance visit, access the shared database to consult the history of the aircraft.
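The comparison of step E6 reduces to thresholding the depth deviation between the scanned mesh and the virtual model at each sampled point; the parallel-list pairing and function name below are illustrative assumptions, not the patent's implementation:

```python
def detect_impacts(mesh_depths_m, model_depths_m, threshold_m=0.001):
    """Sketch of step E6: flag the sample indices where the depth
    measured in the 3D mesh deviates from the virtual model V by more
    than the predefined value (of the order of 1 mm). Assumes the two
    lists are sampled at the same surface points."""
    impacts = []
    for index, (measured, modelled) in enumerate(zip(mesh_depths_m,
                                                     model_depths_m)):
        if abs(measured - modelled) > threshold_m:
            impacts.append(index)
    return impacts
```

A 2.5 mm dent at one sample point is flagged, while sub-millimetre deviations are ignored as scan noise.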
The invention thus makes it possible, via the use of a drone and the selection of an inspection zone of the aircraft, to quickly detect and locate impacts IMP on the outer surface 110 of said aircraft A. [0010] In addition, since the inspection zones available for a given aircraft model are prerecorded in the database DB2, the system 100 according to the invention requires little knowledge of the structure of the aircraft A from the operator and thus allows a large number of operators to carry out this location. The system 100 according to the invention thus has the advantage of being usable by an operator who is unfamiliar with the structure of the aircraft concerned, such as ground personnel at an airport. Thanks to the invention, the planning of aircraft inspection visits is not subject to the availability constraints of a small number of people. The invention is applicable, in addition to an aircraft A, to other bodies such as, for example, the hulls of boats. Although described as a quadrocopter drone, the drone 200 of the system 100 according to the invention could be another type of flying drone, such as for example a helicopter drone equipped with a single propulsion unit. In a variant not shown in the figures, the drone 200 of the system according to the invention advantageously comprises a video camera connected to the controller 241, for example of the CMOS or CCD type, so as to take images of the surface scanned by the rangefinder. The video taken by the video camera is viewable by the operator of the control unit and can be shared on the network NET. The central unit 310 is configured to associate the recording time of the video with the GPS coordinates of the points scanned at the moment of the shooting, so that an operator viewing the video is able to determine the coordinates of an impact IMP on the scanned inspection zone.
Finally, the drone 200 advantageously comprises as a position sensor, in addition to the satellite signal receiver, an accelerometer for determining the position of the drone from the last known position provided by the satellite signal receiver. The advantage is the ability to use the drone 200 in areas where the level of reception of satellite signals is low since, in this case, during a loss of satellite signal, the accelerometer supplements the satellite receiver in providing the position of the drone to the controller 241 or to the central unit 310.
Claims:
Claims (12) [0001] CLAIMS 1) Method for locating impacts (IMP) on an outer surface (110) of a body (A), characterized in that it comprises the following steps: a) modeling the body in a first database (DB1) of a control unit (300) to obtain a virtual model (V) of said body in a virtual coordinate system (O, X, Y, Z); b) modeling, in a second database (DB2) of the control unit, a plurality of point clouds in the virtual coordinate system, each point cloud, called an inspection zone, representing a portion of the external surface of the body (A); c) selection (E3) of an inspection zone via a man-machine interface (320) of the control unit; d) transfer (E2, E2c), by a central unit (310) of the control unit, of the coordinates of each point of the first and second databases (DB1, DB2) from the virtual coordinate system to a geographical reference frame; e) determination (E4) of the coordinates, in the geographical reference frame, of the initial position of a flying drone (200) in communication with the central unit, the drone being equipped with a rangefinder (243); f) calculation (E4), by the central unit (310), of a flight plan of the drone so as to scan, via the rangefinder, the selected inspection zone; g) creation (E5), by the central unit, of a 3D mesh (3DMAP) of the selected inspection zone scanned by the rangefinder (243); h) detection (E6), by the central unit, of the impacts (IMP) by a comparison between the 3D mesh (3DMAP) and the virtual model (V) of said body, and calculation (E7), by the central unit, of the coordinates of each impact (IMP) in the geographical and virtual reference frames. [0002] 2) Method according to claim 1, characterized in that the body (A) comprises an internal structure formed of a plurality of structural elements, the method comprising, following step h) of detection of the impacts (IMP) and of calculation of the coordinates of each impact (IMP), an additional step i) of determining, on the virtual model (V), the structural elements closest to each impact.
[0003] 3) Method according to any one of claims 1 to 2, characterized in that it comprises, following step h) of detection of the impacts (IMP) and of calculation of the coordinates of each impact (IMP), an additional step j) of recording (E7), in a shared database, the coordinates of each impact (IMP) in the virtual coordinate system. [0004] 4) Method according to any one of claims 1 to 3, characterized in that step d) of transfer comprises a step d1) of determination (E2a), by the central unit (310), of the position of the body in the geographical reference frame. [0005] 5) Method according to any one of claims 1 to 4, characterized in that step d) of transfer comprises a step d2) of determination (E2b), by the central unit (310), of the orientation of the body in the geographical reference frame. [0006] 6) Method according to claim 4, characterized in that step d1) of determining the position of the body comprises: d1a) determination of the coordinates of the control unit (300) in the geographical reference frame via satellite positioning means (390) of the control unit; d1b) determination (E1), by the operator, of an identifier (IDN) of the body; d1c) interrogation (E1), via the man-machine interface (320), of an external network (NET) with the identifier (IDN) of the body so as to receive the coordinates of the body in the geographical reference frame.
[0007] 7) Method according to claim 5, characterized in that step d2) of determining the orientation of the body comprises: d2a) the taking, by the operator, via a camera (340) of the control unit, of an image of the body (A) on which a first and a second body element symmetrical with respect to a sagittal plane of said body are visible; d2b) the pointing, on a display of the control unit (300), via the man-machine interface (320), of the first element with a first cursor and of the second element with a second cursor; d2c) the calculation, by the central unit (310), of the distance in the image between the first and the second cursor; d2d) the comparison, by the central unit (310), of the calculated distance and the distance between the first and the second element recorded in the first database (DB1); d2e) the calculation, by the central unit, of the angle between the longitudinal axis of the body and the ground control unit. [0008] 8) Method for locating impacts according to any one of claims 1 to 7, characterized in that step h) of detection of the impacts (IMP) and calculation of the coordinates of each impact (IMP) comprises: h1) the comparison between the 3D mesh (3DMAP) and the virtual model (V) of the body so as to determine deviations between said mesh and the virtual model; h2) the comparison of each deviation with a predetermined deviation value. [0009] 9) Method for locating impacts according to any one of claims 1 to 8, characterized in that the body is an aircraft (A).
[0010] 10) System (100) for locating impacts on an external surface (110) of a body (A) for carrying out the method according to any one of claims 1 to 9, characterized in that the system comprises: a flying drone (200) equipped with a rangefinder (243) and at least one position sensor (242); and a control unit (300) in communication with the drone and with an external network (NET), the control unit comprising a central unit (310), a first database (DB1) comprising data obtained by computer-assisted three-dimensional modeling of the body, and a second database (DB2) comprising the coordinates of a plurality of point clouds, called inspection zones, where each inspection zone represents a portion of the outer surface of the body; the control unit (300) further comprising a man-machine interface (320), a display device (330), a camera (340) and satellite positioning means (390). [0011] 11) System (100) for locating impacts according to claim 10, characterized in that the rangefinder (243) is chosen from among the following rangefinders: acoustic rangefinder, radar rangefinder, optical rangefinder. [0012] 12) System (100) for locating impacts according to any one of claims 10 to 11, characterized in that the drone comprises an accelerometer.
Similar technologies:
Publication number | Publication date | Title
- FR3038109A1 | 2016-12-30 | SYSTEM AND METHOD FOR LOCATING IMPACTS ON EXTERNAL SURFACE
- EP2513668B1 | 2018-01-24 | Method for geo-referencing an imaged area
- US10705193B2 | 2020-07-07 | Monitoring of vehicles using light ranging systems
- CN106233219B | 2020-03-17 | Mobile platform operating system and method
- EP1724592B1 | 2018-04-04 | System for estimating the speed of an aircraft and its application to the detection of obstacles
- CA2853546A1 | 2013-05-02 | Identification and analysis of aircraft landing sites
- US20190037207A1 | 2019-01-31 | Collaborative stereo system for three-dimensional terrain and object reconstruction
- Shi et al. 2017 | Shipborne over- and under-water integrated mobile mapping system and its seamless integration of point clouds
- US10533858B2 | 2020-01-14 | Automated emergency response
- WO2018209898A1 | 2018-11-22 | Information processing device, aerial photographing path generation method, aerial photographing path generation system, program and recording medium
- FR3030091A1 | 2016-06-17 | METHOD AND SYSTEM FOR AUTOMATICALLY DETECTING A MISALIGNMENT IN OPERATION OF A MONITORING SENSOR OF AN AIRCRAFT
- US20200098189A1 | 2020-03-26 | System and Methods for Data Transmission and Rendering of Virtual Objects for Display
- US11175399B2 | 2021-11-16 | Information processing device, information processing method, and storage medium
- CN111308415A | 2020-06-19 | Online pose estimation method and device based on time delay
- Maguire 2014 | Using Unmanned Aerial Vehicles and "Structure from Motion" Software to Monitor Coastal Erosion in Southeast Florida
- KR102130687B1 | 2020-07-07 | System for information fusion among multiple sensor platforms
- US20210383144A1 | 2021-12-09 | Geolocation with aerial and satellite photography
- JP7020418B2 | 2022-02-16 | Information processing equipment, information processing methods, and programs
- KR102239562B1 | 2021-04-13 | Fusion system between airborne and terrestrial observation data
- US20200218289A1 | 2020-07-09 | Information processing apparatus, aerial photography path generation method, program and recording medium
- Monfort et al. 2011 | Assimilation of airborne imagery with a wave model for bathymetric estimation
- FR3080839A1 | 2019-11-08 | SYSTEM AND METHOD FOR EXTERNAL SURFACE INSPECTION
- EP2946367B1 | 2018-03-07 | A method and an apparatus for estimating values for a set of parameters of an imaging system
- CN110617800A | 2019-12-27 | Emergency remote sensing monitoring method, system and storage medium based on civil aircraft
- EP3655725A1 | 2020-05-27 | Method for estimating the movement of an object moving in a magnetic field environment
Family patents:
Publication number | Publication date
- US20160378895A1 | 2016-12-29
- US9830411B2 | 2017-11-28
- FR3038109B1 | 2018-07-27
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
- WO2012044297A1 | 2010-09-30 | 2012-04-05 | Empire Technology Development Llc | Automatic flight control for uav based solid modeling
- US20140046589A1 | 2011-04-14 | 2014-02-13 | Hexagon Technology Center Gmbh | Measuring system for determining 3d coordinates of an object surface
- US20140168420A1 | 2011-04-26 | 2014-06-19 | Eads Deutschland Gmbh | Method and System for Inspecting a Surface Area for Material Defects
- US20140172357A1 | 2012-12-18 | 2014-06-19 | Sharper Shape Ltd | Analysis of sensor data
- US8756085B1 | 2013-03-15 | 2014-06-17 | State Farm Mutual Automobile Insurance Company | Systems and methods for assessing property damage
- US20140267627A1 | 2013-03-15 | 2014-09-18 | State Farm Mutual Automobile Insurance Company | Methods and systems for capturing the condition of a physical structure
- CN113252294A | 2021-06-16 | 2021-08-13 | Southwest Jiaotong University | Cross-sea bridge space wind speed and direction testing system and monitoring method
- US4947351A | 1988-05-06 | 1990-08-07 | The United States Of America As Represented By The Secretary Of The Air Force | Ultrasonic scan system for nondestructive inspection
- US6014024A | 1995-07-31 | 2000-01-11 | Battelle Memorial Institute | Apparatus and method for detecting and/or measuring flaws in conductive material
- US5760904A | 1996-07-26 | 1998-06-02 | General Electric Company | Method and system for inspecting a surface of an object with laser ultrasound
- US20060186260A1 | 2005-02-18 | 2006-08-24 | American Airlines, Inc. | Method of inspecting aircraft
- GB2493929B | 2011-08-22 | 2015-11-25 | Bae Systems Plc | Determining impact damage in a composite member by acoustic wave processing
- WO2015081383A1 | 2013-12-04 | 2015-06-11 | Spatial Information Systems Research Ltd | Method and apparatus for developing a flight path
- US10527423B1 | 2016-04-07 | 2020-01-07 | Luftronix, Inc. | Fusion of vision and depth sensors for navigation in complex environments
- US10170011B2 | 2016-07-26 | 2019-01-01 | International Business Machines Corporation | Guide drones for airplanes on the ground
- US10820574B2 | 2016-07-29 | 2020-11-03 | International Business Machines Corporation | Specialized contextual drones for virtual fences
- US9987971B2 | 2016-07-29 | 2018-06-05 | International Business Machines Corporation | Drone-enhanced vehicle external lights
- US10067502B1 | 2017-03-07 | 2018-09-04 | Toyota Research Institute, Inc. | Service drone configuration based on a serviceable vehicle-component fault condition
- GB2565757A | 2017-07-13 | 2019-02-27 | Sita Information Networking Computing Uk Ltd | Database of Drone flight plans for aircraft inspection using relative mapping
- FR3084485A1 | 2018-07-26 | 2020-01-31 | Donecle | MOTORIZED FLYING MACHINE FOR MEASURING THE RELIEF OF SURFACES OF A PREDETERMINED OBJECT AND METHOD FOR CONTROLLING SUCH A MACHINE
- US20200377233A1 | 2019-05-29 | 2020-12-03 | Rizse, Inc. | UAV-based aviation inspection systems and related methods
- CN112053361B | 2020-10-15 | 2021-11-09 | Nanjing University of Aeronautics and Astronautics | Aircraft skin butt joint detection method based on large-scale point cloud
Legal status:
- 2016-06-27 | PLFP | Fee payment | Year of fee payment: 2
- 2016-12-30 | PLSC | Search report ready | Effective date: 20161230
- 2017-06-21 | PLFP | Fee payment | Year of fee payment: 3
- 2018-06-26 | PLFP | Fee payment | Year of fee payment: 4
- 2020-06-19 | PLFP | Fee payment | Year of fee payment: 6
- 2021-06-22 | PLFP | Fee payment | Year of fee payment: 7
Priority:
Application number | Filing date | Patent title
- FR1556048 (FR3038109B1) | 2015-06-29 | SYSTEM AND METHOD FOR LOCATING IMPACTS ON EXTERNAL SURFACE
- US15/193,497 (US9830411B2) | 2016-06-27 | System and method for locating impacts on an external surface