Patent abstract:
Automated visual inspection system. The present invention relates to a method and an apparatus for inspecting an object. In response to the presence of the object in an inspection area, a volume containing the object is identified, the volume having a plurality of parts. A number of sensor systems is assigned to the plurality of parts of the volume. Each sensor system in the number of sensor systems is assigned to a number of parts in the plurality of parts of the volume based on whether each sensor system is capable of generating data with a desired level of quality on a surface of the object in a particular part in the plurality of parts. Data on the surface of the object is generated using the number of sensor systems assigned to the plurality of parts of the volume, and a determination is made as to whether a number of inconsistencies is present on the surface of the object using the data.
Publication number: BR112013008075B1
Application number: R112013008075-2
Filing date: 2011-09-15
Publication date: 2021-04-06
Inventor: Soon J Jung
Applicant: The Boeing Company
IPC main classification:
Patent description:

[0001] [0001] The present disclosure relates generally to the inspection of objects and, in particular, to the inspection of an aircraft. Still more particularly, the present disclosure relates to a method and apparatus for automatically inspecting an aircraft on the ground. 2. Background
[0002] [0002] Aircraft and parts of the aircraft are inspected during different phases of the aircraft's life. For example, when an aircraft is being assembled, the different parts of the aircraft are inspected during various stages of assembly. In addition, during testing and certification of an aircraft, inspections are carried out to determine whether different parts of the aircraft are performing as expected and desired.
[0003] [0003] During the use of an aircraft, periodic checks are made after a certain amount of time or use. For example, a check may be performed after approximately five to approximately 800 hours of use, approximately every three months, or approximately every 12 to approximately 18 months, depending on the type of inspection. An inspection of an aircraft may include a visual inspection of the exterior of the aircraft. In other cases, the inspection may involve removing different parts of the aircraft and inspecting those parts. An inspection may result in maintenance being performed on the aircraft.
[0004] [0004] Currently, these inspections are carried out by people using instructions that identify parts and inconsistencies that a person should look for. These people are also referred to as maintenance operators. The results of these inspections are written or entered into a database by the maintenance operator.
[0005] [0005] For example, in some inspections, an aircraft can be moved into a hangar. A maintenance operator can walk around the aircraft to determine if any inconsistencies are present on the aircraft's surface. These inconsistencies can include, for example, without limitation, a dent, a hole, missing rivets, or some other type of inconsistency.
[0006] [0006] This type of inspection requires greater amounts of time than desired. In addition, the maintenance operators who carry out inspections need a level of training and experience that allows inconsistencies to be identified with a desired level of accuracy. The amount of time, knowledge, and experience required of maintenance operators results in a high cost for conducting aircraft inspections.
[0007] [0007] Therefore, it would be advantageous to have a method and apparatus that takes into account one or more of the issues discussed above, as well as other possible issues. SUMMARY
[0008] [0008] In one illustrative embodiment, a method for inspecting an object is provided. In response to the presence of the object in an inspection area, a volume containing the object is identified. The volume has a plurality of parts. A number of sensor systems is assigned to the plurality of parts of the volume. Each sensor system in the number of sensor systems is assigned to a number of parts in the plurality of parts of the volume based on whether each sensor system is capable of generating data with a desired level of quality on a surface of the object in a particular part in the plurality of parts. Data on the surface of the object is generated using the number of sensor systems assigned to the plurality of parts of the volume. A determination is made as to whether a number of inconsistencies is present on the surface of the object using the data.
[0009] [0009] In another illustrative embodiment, an apparatus comprises a number of sensor systems located in an inspection area and a computer system in communication with the number of sensor systems. The computer system is configured to identify a volume that contains an object. The volume has a plurality of parts. The computer system is configured to assign the number of sensor systems to the plurality of parts of the volume. Each sensor system in the number of sensor systems is assigned to a number of parts in the plurality of parts of the volume based on whether each sensor system is capable of generating data with a desired level of quality on a surface of the object in a particular part in the plurality of parts. The computer system is configured to generate data on the surface of the object using the number of sensor systems assigned to the plurality of parts of the volume. The computer system is configured to determine whether a number of inconsistencies is present on the surface of the object.
[0010] [00010] The features, functions, and advantages can be achieved independently in various embodiments of the present disclosure or may be combined in yet other embodiments in which further details can be seen with reference to the description and drawings below. BRIEF DESCRIPTION OF THE FIGURES
[0011] [00011] The characteristics believed to be novel features of the illustrative embodiments are set forth in the appended claims. The illustrative embodiments, however, as well as a preferred mode of use, further objectives, and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment of the present disclosure when read in conjunction with the accompanying drawings, in which: Figure 1 is an illustration of an aircraft manufacturing and service method in accordance with an illustrative embodiment; Figure 2 is an illustration of an aircraft in which an illustrative embodiment may be implemented; Figure 3 is an illustration of an inspection environment in accordance with an illustrative embodiment; Figure 4 is an illustration of a data processing system in accordance with an illustrative embodiment; Figure 5 is an illustration of a sensor system in accordance with an illustrative embodiment; Figure 6 is an illustration of a test system in accordance with an illustrative embodiment; Figure 7 is an illustration of a perspective view of an inspection environment in accordance with an illustrative embodiment; Figure 8 is an illustration of an enlarged perspective view of part of an inspection environment in accordance with an illustrative embodiment; Figure 9 is an illustration of a front view of an inspection environment in accordance with an illustrative embodiment; Figure 10 is an illustration of a top view of a volume in an inspection area in accordance with an illustrative embodiment; Figure 11 is an illustration of a side view of a volume in an inspection area in accordance with an illustrative embodiment; Figure 12 is an illustration of a perspective view of an inspection environment in accordance with an illustrative embodiment; and Figure 13 is an illustration of a flowchart of a process for inspecting an object in accordance with an illustrative embodiment. DETAILED DESCRIPTION
[0012] [00012] With reference more particularly to the drawings, embodiments of the disclosure may be described in the context of aircraft manufacturing and service method 100 as shown in Figure 1 and aircraft 200 as shown in Figure 2. Turning first to Figure 1, an illustration of an aircraft manufacturing and service method is depicted in accordance with an illustrative embodiment. During pre-production, aircraft manufacturing and service method 100 may include specification and design 102 of aircraft 200 in Figure 2 and material procurement 104.
[0013] [00013] During production, component and subassembly manufacturing 106 and system integration 108 of aircraft 200 in Figure 2 take place. Thereafter, aircraft 200 in Figure 2 may go through certification and delivery 110 in order to be placed in service 112. While in service 112 by a customer, aircraft 200 in Figure 2 is scheduled for routine maintenance and service 114, which may include modification, reconfiguration, overhaul, and other maintenance or service.
[0014] [00014] Each of the processes of aircraft manufacturing and service method 100 may be performed or carried out by a system integrator, a third party, and/or an operator. In these examples, the operator may be a customer. For the purposes of this description, a system integrator may include, without limitation, any number of aircraft manufacturers and major-system subcontractors; a third party may include, without limitation, any number of vendors, subcontractors, and suppliers; and an operator may be an airline, a leasing company, a military entity, a service organization, and so on.
[0015] [00015] With reference now to Figure 2, an illustration of an aircraft is shown in which an illustrative embodiment may be implemented. In this example, aircraft 200 is produced by aircraft manufacturing and service method 100 in Figure 1 and may include fuselage 202 with a plurality of systems 204 and interior 206. Examples of systems 204 include one or more of propulsion system 208, electrical system 210, hydraulic system 212, and environmental system 214. Any number of other systems may be included. Although an aerospace example is shown, different illustrative embodiments may be applied to other industries, such as the automotive industry and/or the shipping industry.
[0016] [00016] The apparatus and methods described in this document may be used during at least one of the stages of aircraft manufacturing and service method 100 in Figure 1. As used in this document, the phrase "at least one of", when used with a list of items, means that different combinations of one or more of the listed items may be used and only one of each item in the list may be needed. For example, "at least one of item A, item B, and item C" may include, for example, without limitation, item A or item A and item B. This example also may include item A, item B, and item C or item B and item C.
[0017] [00017] In one illustrative example, components or subassemblies produced in component and subassembly manufacturing 106 in Figure 1 may be fabricated or manufactured in a manner similar to components or subassemblies produced while aircraft 200 is in service 112 in Figure 1. As yet another example, a number of apparatus embodiments, method embodiments, or a combination thereof may be used during the production stages, such as component and subassembly manufacturing 106 and system integration 108 in Figure 1. A number, when used with reference to items, means one or more items. For example, a number of apparatus embodiments may be one or more apparatus embodiments. A number of apparatus embodiments, method embodiments, or a combination thereof may be used while aircraft 200 is in service 112 and/or during maintenance and service 114 in Figure 1. The use of a number of the different illustrative embodiments may substantially expedite the assembly of and/or reduce the cost of aircraft 200.
[0018] [00018] The different illustrative embodiments recognize and take into account a number of considerations. For example, the different illustrative embodiments recognize and take into account that aircraft inspection may not be as consistent as desired. As an illustrative example, different levels of experience and knowledge among maintenance operators can result in different maintenance operators identifying different inconsistencies on the same aircraft. In other words, one maintenance operator may not see an inconsistency that another maintenance operator may see, depending on the difference in knowledge and experience.
[0019] [00019] Even with the same knowledge and experience, the different illustrative embodiments also recognize that maintenance operators may miss an inconsistency entirely or make a judgment that an inconsistency is not present. With the same aircraft, another maintenance operator may determine that the inconsistency is present.
[0020] [00020] The different illustrative embodiments recognize and take into account that, with maintenance operators carrying out inspections, it can be difficult to see the upper parts of an aircraft, such as the top of an aircraft. As a result, some inconsistencies may not be detected or identified by maintenance operators. A maintenance operator may be required to climb a ladder or use an elevator to view the upper parts of an aircraft. The different illustrative embodiments recognize and take into account that this type of process increases the time needed to inspect the aircraft and requires equipment that allows maintenance operators to see the higher parts of the aircraft that cannot be easily seen from the ground.
[0021] [00021] Therefore, the different illustrative embodiments provide a method and apparatus for inspecting objects, such as an aircraft. In response to the presence of an object in an inspection area, a volume that contains the object is identified. This volume has a plurality of parts. A number of sensor systems is assigned to the plurality of parts of the volume. Each sensor system in the number of sensor systems may be assigned to a number of parts in the plurality of parts of the volume.
[0022] [00022] This assignment of the number of sensor systems is based on whether each sensor system is capable of generating data with a desired level of quality on a surface of the object in a particular part in the plurality of parts. Data on the surface of the object is then generated using the number of sensor systems assigned to the plurality of parts of the volume. A determination is made as to whether a number of inconsistencies is present on the surface of the object. This information can then be used to perform maintenance and/or other operations on the object.
[0023] [00023] With reference now to Figure 3, an illustration of an inspection environment is shown in accordance with an illustrative embodiment. In these illustrative examples, inspection environment 300 may be used during different stages of aircraft manufacturing and service method 100 in Figure 1.
[0024] [00024] Inspection environment 300 in Figure 3 is used to inspect object 302 for number of inconsistencies 304. In these illustrative examples, object 302 is aircraft 306. Aircraft 306 may be implemented using, for example, aircraft 200 in Figure 2. In these illustrative examples, number of inconsistencies 304 may include, for example, without limitation, at least one of a dent, a crack, a hole, and/or some other type of inconsistency.
[0025] [00025] In these illustrative examples, the inspection of aircraft 306 takes place at location 308. In particular, location 308 may be in hangar 310 in these examples. Location 308 in hangar 310 forms inspection area 312 for inspecting aircraft 306.
[0026] [00026] The number of sensor systems 314 is associated with inspection area 312 in these illustrative examples. In these depicted examples, the number of sensor systems 314 may include mobile sensor system 315. Mobile sensor system 315 is configured to move across floor 311 or through air 313 in inspection area 312 in hangar 310.
[0027] [00027] The number of sensor systems 314 can be placed at locations 317 in hangar 310 so that substantially the entire surface 321 of object 302 can be detected by the number of sensor systems 314. In this way, the different illustrative embodiments provide a capability to inspect the entire object 302 more thoroughly as compared to currently used methods. This type of improvement may be especially evident when object 302 takes the form of aircraft 306.
[0028] [00028] Computer system 316, in these illustrative examples, is in communication with the number of sensor systems 314. Computer system 316 communicates with the number of sensor systems 314 through network 318. Network 318 may include wired communications links, wireless communications links, or a combination of the two.
[0029] [00029] In these illustrative examples, computer system 316 comprises number of computers 320. The number of computers 320 may be in communication with each other via network 318 or over different networks, depending on the particular implementation.
[0030] [00030] Inspection process 322 runs on one or more of the computers in the number of computers 320. In other words, inspection process 322 may be distributed among different computers in the number of computers 320. Additionally, inspection process 322 may be implemented in program code, hardware, or a combination of the two on the number of computers 320. In these illustrative examples, the number of sensor systems 314 generates data 324, which is sent to inspection process 322.
[0031] [00031] In these illustrative examples, inspection process 322 identifies volume 326 in response to the presence of object 302 in inspection area 312. This initiation of inspection process 322 may be performed automatically in response to the presence of object 302. In other illustrative examples, inspection process 322 may begin inspecting object 302 when object 302 is present in inspection area 312 and input is received to initiate the inspection. This input may be user input or some other suitable type of input.
[0032] [00032] Volume 326 contains object 302. In other words, object 302 is located within volume 326. Inspection process 322 assigns the number of sensor systems 314 to plurality of parts 328 of volume 326. The assignment of the number of sensor systems 314 to the plurality of parts 328 is based on whether each sensor system is capable of generating data 324 with desired quality level 332 on surface 321 of object 302 in a particular part 336 in the plurality of parts 328.
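A minimal sketch of how such an assignment could be computed, assuming the volume is divided into parts with known centers and that the quality level is approximated by the number of image pixels per metre each camera achieves at its distance from a part (the camera parameters, part grid, and metric below are illustrative assumptions):

```python
import math

def pixels_per_metre(sensor, part_center):
    """Estimate the image resolution a sensor system achieves at a part of the volume."""
    dx, dy, dz = (p - s for p, s in zip(part_center, sensor["position"]))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Width of the surface footprint imaged at that distance, from the field of view.
    footprint = 2.0 * distance * math.tan(math.radians(sensor["fov_deg"]) / 2.0)
    return sensor["pixels_across"] / footprint

def assign_sensor_systems(sensor_systems, parts, desired_ppm):
    """Assign each sensor system to every part it can image at the desired quality level."""
    assignment = {s["name"]: [] for s in sensor_systems}
    for part in parts:
        for sensor in sensor_systems:
            if pixels_per_metre(sensor, part["center"]) >= desired_ppm:
                assignment[sensor["name"]].append(part["name"])
    return assignment

sensor_systems = [{"name": "camera_716", "position": (0.0, 0.0, 8.0),
                   "fov_deg": 60.0, "pixels_across": 4096}]
parts = [{"name": "part_1004", "center": (2.0, 3.0, 0.0)}]
print(assign_sensor_systems(sensor_systems, parts, desired_ppm=300.0))
```

In this sketch, a part may be covered by several sensor systems and a sensor system may cover several parts, matching the one-to-many assignment described above.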
[0033] [00033] In these illustrative examples, data 324 generated by the number of sensor systems 314 takes the form of number of images 338. The number of images 338 may include still images, images for a video, a combination of the two, or some other suitable type of image.
[0034] [00034] In these illustrative examples, the number of images 338 may be generated by the number of sensor systems 314 using visible light, infrared light, and/or other suitable types of light. In addition, the number of images 338 may also be generated by a laser beam directed at surface 321 of object 302, with data 324 forming measurements of the distance to surface 321 to generate images in the number of images 338. Of course, other types of images may be used, depending on the particular implementation.
[0035] [00035] In these depicted examples, inspection process 322 compares data 324 with reference data 340 in database 342. Reference data 340 is obtained for object 302 at a time prior to the generation of data 324. In other words, reference data 340 is obtained for object 302 at a time prior to the inspection of object 302 for the number of inconsistencies 304.
[0036] [00036] Reference data 340 may take the form of number of images 344 generated after object 302 has been manufactured. In other examples, the number of images 344 may be images of object 302 taken prior to current use of object 302. In still other illustrative examples, reference data 340 may be generated from a model of object 302.
[0037] [00037] Inspection process 322 determines whether the number of inconsistencies 304 is present on surface 321 of object 302 by comparing data 324 with reference data 340. For example, inspection process 322 may compare data 324 with reference data 340 to identify number of pixel locations 341 in the number of images 338 where data 324 does not correspond to reference data 340 within a selected limit. In this way, the number of inconsistencies 304 is identified at the number of pixel locations 341 in the number of images 338.
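A minimal sketch of this comparison, assuming the generated image and the reference image are aligned grayscale arrays of the same size and that the selected limit is a simple per-pixel intensity threshold (both assumptions are illustrative):

```python
import numpy as np

def inconsistency_pixels(image, reference, selected_limit=25):
    """Return (row, column) pixel locations where the images disagree beyond the limit."""
    difference = np.abs(image.astype(np.int16) - reference.astype(np.int16))
    rows, cols = np.nonzero(difference > selected_limit)
    return list(zip(rows.tolist(), cols.tolist()))

# Usage example with synthetic 8-bit images.
reference = np.full((4, 4), 100, dtype=np.uint8)
current = reference.copy()
current[2, 3] = 180            # simulate an inconsistency showing up as a bright spot
print(inconsistency_pixels(current, reference))   # [(2, 3)]
```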
[0038] [00038] In these depicted examples, each pixel location in the number of pixel locations 341 is defined using an x-y coordinate system for the pixels in the image containing the pixel location. Inspection process 322 identifies the locations on surface 321 of aircraft 306 that correspond to the locations in the number of images 338. In this way, inspection process 322 identifies number of locations 348 that correspond to the number of pixel locations 341. The number of locations 348 comprises the actual locations on surface 321 of aircraft 306 for the number of inconsistencies 304.
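One way such a pixel-to-surface mapping could be sketched, assuming each camera has a precomputed calibration, simplified here to a planar homography valid for the locally flat patch of skin it images (the matrix values below are placeholders, not calibration data from the disclosure):

```python
import numpy as np

def pixel_to_surface(pixel_location, H):
    """Map an (u, v) image pixel to (x, y) coordinates on the imaged surface patch."""
    u, v = pixel_location
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

H = np.array([[0.01, 0.0, -5.0],     # placeholder calibration for one camera
              [0.0, 0.01, -3.0],
              [0.0, 0.0, 1.0]])
print(pixel_to_surface((640, 480), H))   # e.g. surface patch coordinates in metres
```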
[0039] [00039] The comparison between data 324 and reference data 340 may be made using a number of different techniques. For example, at least one of image segmentation, edge detection, image enhancement, geometric pattern matching, wavelet transformation, graph-based algorithms, and other suitable techniques may be used to compare data 324 to reference data 340.
[0040] [00040] In response to a determination that the number of inconsistencies 304 is present on surface 321 of object 302, inspection process 322 may identify number of maintenance operations 346 to perform on object 302. These maintenance operations may include, for example, without limitation, replacement of parts, redesign of parts, additional inspections, and/or other suitable types of maintenance operations.
[0041] [00041] For example, inspection process 322 may control test system 350 to perform additional inspections at the number of locations 348 where the number of inconsistencies 304 has been identified. In these illustrative examples, test system 350 may include number of mobile test systems 354. The number of mobile test systems 354 may move between the number of locations 348 to perform additional inspections of the number of inconsistencies 304. In these illustrative examples, the number of mobile test systems 354 performs non-destructive tests 356 at the number of locations 348 where the number of inconsistencies 304 has been identified.
[0042] [00042] In these depicted examples, non-destructive tests 356 include a number of different types of testing techniques that do not generate additional inconsistencies in or cause undesired changes to object 302. For example, non-destructive tests 356 may include at least one of tests using ultrasound signals, magnetic particles, liquid penetrant, X-rays, eddy currents, and/or other suitable techniques for performing additional inspection of object 302.
[0043] [00043] In this way, the different illustrative embodiments provide an improved method and apparatus over current inspection systems for identifying inconsistencies in objects, such as aircraft. In these illustrative examples, time and effort can be saved for objects, such as aircraft 306. In particular, the inspection of object 302 in the form of aircraft 306 can be performed more quickly and more accurately using the number of sensor systems 314 and inspection process 322 than using human maintenance operators.
[0044] [00044] The illustration of inspection environment 300 in Figure 3 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented. Other components may be used in addition to and/or in place of those illustrated. Some components may be unnecessary in some illustrative embodiments. Also, the blocks are presented to illustrate some functional components. One or more of these blocks may be combined and/or divided into different blocks when implemented in different illustrative embodiments.
[0045] [00045] For example, the different illustrative embodiments may be applied to objects other than aircraft 306. For example, the different illustrative embodiments may be applied to other types of objects, such as, for example, without limitation, a ground-based structure, a water-based structure, a space-based structure, and/or some other suitable type of object. More specifically, the different illustrative embodiments may be applied to, for example, without limitation, a submarine, a bus, a personnel carrier, a tank, a train, an automobile, a space vehicle, a space station, a satellite, a surface ship, a power plant, a dam, an engine, a flap, a part of a fuselage, a manufacturing facility, a building, and/or some other suitable object.
[0046] [00046] Additionally, these inspections may be carried out at times other than when performing maintenance on an aircraft. For example, the different illustrative embodiments may be applied to parts manufactured for aircraft 306 and during testing and certification of aircraft 306. In addition, the different illustrative embodiments may be applied to inspecting the interior of an aircraft. For example, the number of sensor systems 314 may be present inside aircraft 306 or located on mobile platforms that move within aircraft 306 to inspect the interior surface of aircraft 306.
[0047] [00047] Turning now to Figure 4, an illustration of a data processing system is shown in accordance with an illustrative embodiment. In this illustrative example, data processing system 400 is an example of one implementation for one or more computers in the number of computers 320 in computer system 316 in Figure 3.
[0048] [00048] As shown, data processing system 400 includes communications structure 402, which provides communication between processor unit 404, memory 406, persistent storage 408, communications unit 410, input/output (I/O) unit 412, and screen 414. Data processing system 400 is an example of a data processing system that can be used to implement the number of computers 320 in computer system 316 in Figure 3.
[0049] [00049] Processor unit 404 serves to execute instructions for software that can be loaded into memory 406. Processor unit 404 can be a number of processors, a multiprocessor core, or some other type of processor, depending on the particular implementation. A number, as used in this document with reference to an item, means one or more items. In addition, processor unit 404 can be implemented using a number of heterogeneous processor systems in which a main processor is present with secondary processors on a single chip. As another illustrative example, processor unit 404 may be a symmetric multiprocessor system containing multiple processors of the same type.
[0050] [00050] Memory 406 and persistent storage 408 are examples of storage devices 416. A storage device is any piece of hardware that is capable of storing information, such as, for example, without limitation, data, program code in functional form, and/or other suitable information, either on a temporary basis and/or a permanent basis. Storage devices 416 may also be referred to in these examples as computer-readable storage devices. Memory 406, in these examples, can be, for example, a random access memory or any other suitable volatile or non-volatile storage device. Persistent storage 408 can take many forms, depending on the particular implementation.
[0051] [00051] For example, persistent storage 408 may contain one or more components or devices. For example, persistent storage 408 can be a hard disk, a flash memory, a rewritable optical disc, a rewritable magnetic tape, or some combination of the above. The media used by persistent storage 408 can also be removable. For example, a removable hard drive can be used for persistent storage 408.
[0052] [00052] Communications unit 410, in these examples, provides communications with other data processing systems or devices. In these examples, the communications unit 410 is a network interface card. Communications unit 410 can provide communications through the use of either or both physical and wireless communication links.
[0053] [00053] Input/output unit 412 allows the input and output of data with other devices that can be connected to data processing system 400. For example, input/output unit 412 can provide a connection for user input via a keyboard, a mouse, and/or some other suitable input device. In addition, input/output unit 412 can send output to a printer. Screen 414 provides a mechanism for displaying information to a user.
[0054] [00054] Instructions for the operating system, applications, and/or programs can be located on storage devices 416, which are in communication with processor unit 404 through communications structure 402. In these illustrative examples, the instructions are in a functional form on persistent storage 408. These instructions can be loaded into memory 406 for execution by processor unit 404. The processes of the different embodiments can be performed by processor unit 404 using computer-implemented instructions, which can be located in a memory, such as memory 406.
[0055] [00055] These instructions are referred to as program code, computer-usable program code, or computer-readable program code that can be read and executed by a processor in processor unit 404. The program code in the different embodiments can be embodied on different physical or computer-readable storage media, such as memory 406 or persistent storage 408.
[0056] [00056] Program code 418 is located in a functional form on computer-readable media 420, which is selectively removable, and can be loaded onto or transferred to data processing system 400 for execution by processor unit 404. Program code 418 and computer-readable media 420 form computer program product 422 in these examples. In one example, computer-readable media 420 can be computer-readable storage media 424 or computer-readable signal media 426. Computer-readable storage media 424 may include, for example, an optical or magnetic disk that is inserted or placed into a drive or other device that is part of persistent storage 408 for transfer onto a storage device, such as a hard disk, that is part of persistent storage 408.
[0057] [00057] Computer-readable storage media 424 also can take the form of persistent storage, such as a hard disk, a thumb drive, or a flash memory, that is connected to data processing system 400. In some cases, computer-readable storage media 424 may not be removable from data processing system 400. In these illustrative examples, computer-readable storage media 424 is a non-transitory, computer-readable storage medium.
[0058] [00058] Alternatively, program code 418 can be transferred to data processing system 400 using computer-readable signal media 426. Computer-readable signal media 426 can be, for example, a propagated data signal containing program code 418. For example, computer-readable signal media 426 can be an electromagnetic signal, an optical signal, and/or any other suitable type of signal. These signals can be transmitted over communications links, such as wireless communications links, fiber optic cable, coaxial cable, a wire, and/or any other suitable type of communications link. In other words, the communications link and/or the connection can be physical or wireless in the illustrative examples.
[0059] [00059] In some illustrative embodiments, program code 418 can be downloaded over a network to persistent storage 408 from another device or data processing system through computer-readable signal media 426 for use within data processing system 400. For example, program code stored on a computer-readable storage medium in a server data processing system can be downloaded over a network from the server to data processing system 400. The data processing system providing program code 418 can be a server computer, a client computer, or some other device capable of storing and transmitting program code 418.
[0060] [00060] The different components illustrated for data processing system 400 are not meant to provide architectural limitations to the manner in which different embodiments may be implemented. The different illustrative embodiments can be implemented in a data processing system including components in addition to or in place of those illustrated for data processing system 400. Other components shown in Figure 4 can be varied from the illustrative examples shown.
[0061] [00061] The different embodiments can be implemented using any hardware device or system capable of executing program code. As one example, the data processing system may include organic components integrated with inorganic components and/or may be comprised entirely of organic components excluding a human being. For example, a storage device can be comprised of an organic semiconductor.
[0062] [00062] In another illustrative example, processor unit 404 may take the form of a hardware unit that has circuits that are manufactured or configured for a particular use. This type of hardware can perform operations without requiring program code to be loaded into a memory from a storage device to be configured to perform the operations.
[0063] [00063] For example, when processor unit 404 takes the form of a hardware unit, processor unit 404 may be a circuit system, an application specific integrated circuit (ASIC), a programmable logic device, or some other suitable type of hardware configured to perform a number of operations. With a programmable logic device, the device is configured to perform the number of operations. The device can be reconfigured at a later time or can be permanently configured to perform the number of operations.
[0064] [00064] Examples of programmable logic devices include, for example, a programmable logic array, a field programmable logic array, a field programmable gate array, and other suitable hardware devices. With this type of implementation, program code 418 may be omitted, because the processes for the different embodiments are implemented in a hardware unit.
[0065] [00065] In yet another illustrative example, processor unit 404 can be implemented using a combination of processors found in computers and hardware units. Processor unit 404 can have a number of hardware units and a number of processors that are configured to run program code 418. With this depicted example, some of the processes can be implemented in the number of hardware units, while other processes can be implemented in the number of processors.
[0066] [00066] As another example, a storage device in data processing system 400 is any hardware device that can store data. Memory 406, persistent storage 408, and computer-readable media 420 are examples of storage devices in a tangible form.
[0067] [00067] In another example, a bus system can be used to implement communications structure 402 and can be comprised of one or more buses, such as a system bus or an input/output bus. Of course, the bus system can be implemented using any suitable type of architecture that provides for a transfer of data between different components or devices connected to the bus system. Additionally, the communications unit may include one or more devices used to transmit and receive data, such as a modem or a network adapter. Further, a memory can be, for example, memory 406, or a cache, such as found in an interface and memory controller hub that may be present in communications structure 402.
[0068] [00068] With reference now to Figure 5, an illustration of a sensor system is shown in accordance with an illustrative embodiment. In this illustrative example, sensor system 500 is an example of a sensor system that can be used to implement a sensor system in the number of sensor systems 314 in Figure 3.
[0069] [00069] As shown in this example, sensor system 500 comprises number of cameras 502. The number of cameras 502 is configured to generate data 504 by generating number of images 506. The number of images 506 can be, for example, without limitation, at least one of still images 508, video 510, and/or other suitable types of images.
[0070] [00070] In these illustrative examples, the number of cameras 502 can generate the number of images 506 for area 512. The number of cameras 502 can be fixed or can be movable about number of axes 514.
[0071] [00071] This movement about the number of axes 514 is controlled through motor system 516 and controller 518. Additionally, the movement about the number of axes 514 can be referred to as horizontal and vertical rotation in these illustrative examples.
[0072] [00072] Although the number of cameras 502 may be capable of generating the number of images 506 over area 512, data 504 may be generated for only part 520 of area 512. Part 520 of area 512 may provide the number of images 506 with desired quality level 522.
[0073] [00073] In these illustrative examples, desired quality level 522 takes the form of resolution 524. The resolution of a camera in the number of cameras 502 can be measured in pixels and is a measure of the quality of an image. The quality of an image can be based on characteristics, such as, for example, without limitation, sharpness, color intensity, color contrast, distortion, compression, noise, dynamic range, and/or other suitable characteristics. As an illustrative example, as the resolution of an image increases, characteristics such as the sharpness of an image and the ability to discern objects in an image also increase.
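A minimal sketch relating camera resolution, field of view, and imaging distance to a desired quality level, assuming the quality level is expressed as pixels per millimetre on the surface (this metric and the numbers below are illustrative assumptions):

```python
import math

def surface_sampling(pixels_across, fov_deg, distance_m):
    """Pixels per millimetre achieved on the surface at a given imaging distance."""
    footprint_mm = 2.0 * distance_m * 1000.0 * math.tan(math.radians(fov_deg) / 2.0)
    return pixels_across / footprint_mm

def meets_quality(pixels_across, fov_deg, distance_m, desired_px_per_mm):
    """True if the camera achieves the desired quality level for that part of the area."""
    return surface_sampling(pixels_across, fov_deg, distance_m) >= desired_px_per_mm

# A 4096-pixel-wide camera with a 60 degree field of view imaging from 8 metres:
print(round(surface_sampling(4096, 60.0, 8.0), 3))           # ~0.443 px/mm
print(meets_quality(4096, 60.0, 8.0, desired_px_per_mm=0.4))  # True
```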
[0074] [00074] With reference now to Figure 6, an illustration of a test system is shown in accordance with an illustrative embodiment. In this illustrative example, test system 600 is an example of one implementation for test system 350 in Figure 3.
[0075] [00075] In this depicted example, test system 600 is mobile test system 602. As illustrated, mobile test system 602 comprises platform 604, propulsion system 606, controller 608, and non-destructive testing unit 610. Platform 604 provides a structure for the other components in mobile test system 602. Propulsion system 606, controller 608, and non-destructive testing unit 610 are associated with platform 604.
[0076] [00076] Propulsion system 606 is configured to move mobile test system 602. Propulsion system 606 can move mobile test system 602 on the ground, through the air, or a combination of the two.
[0077] [00077] For example, propulsion system 606 may comprise motor 612 and track system 614. Motor 612 causes track system 614 to move platform 604 on the ground. In other illustrative examples, propulsion system 606 may comprise motor 616 and blades 618. Motor 616 is configured to rotate blades 618 to provide lift for movement of mobile test system 602.
[0078] [00078] Non-destructive testing unit 610 may comprise at least one of X-ray system 620, eddy current test system 622, ultrasound system 624, camera system 626, and/or other suitable types of non-destructive testing systems. In this illustrative example, X-ray system 620 can be configured to generate images using X-rays. Eddy current test system 622 can be used to detect inconsistencies in conductive materials through electromagnetic induction. Ultrasound system 624 can be configured to send signals through materials to identify inconsistencies.
[0079] [00079] Camera system 626 can have a higher resolution than the cameras in the number of sensor systems 314 in Figure 3. By moving camera system 626 to the location of a detected inconsistency, more details of the inconsistency can be identified. In this way, camera system 626 can be used to perform additional inspection of the detected inconsistency.
[0080] [00080] Controller 608 can be a data processing system, such as data processing system 400 in Figure 4, or a processing unit. Controller 608 is configured to control the mobile test system 602. For example, controller 608 can control the movement of the mobile test system 602. Additionally, controller 608 can control the generation of data by the non-destructive test unit 610. The movement and data generated by the mobile test system 602 can be controlled through instructions or commands received from the inspection process 322 in Figure 3.
[0081] [00081] Referring now to Figure 7, an illustration of a perspective view of an inspection environment according to an illustrative embodiment is shown. In this illustrative example, inspection environment 700 is an example of an implementation for inspection environment 300 in Figure 3.
[0082] [00082] As shown, inspection environment 700 includes inspection area 702 and aircraft 704 in inspection area 702. Inspection area 702 is located in hangar 706 in this illustrative example. As illustrated, sensor systems 708 are located in inspection area 702. Sensor systems 708 are configured to generate data on surface 710 of aircraft 704. In some illustrative examples, sensor systems 708 can be configured to generate data on other parts of aircraft 704, such as internal parts of aircraft 704. For example, sensor systems 708 may include X-ray systems.
[0083] [00083] In this illustrative example, sensor systems 708 include camera systems 714, 716, 718, 720, 722, 724, 726, 728, 730, 732, 734, 735, 736, 762, 764, and 766. These camera systems can be implemented using, for example, a camera in the number of cameras 502 in Figure 5. In this illustrative example, these camera systems generate images of surface 710 of aircraft 704. In particular, these camera systems generate video.
[0084] [00084] Camera systems 714, 716, 718, 720, 722, 724, 726, 728, 730, 732, 734, 735, and 736 are located in hangar 706 in this depicted example. For example, camera systems 714, 716, 718, 720, 722, 724, 726, 728, 730, 732, 734, 735, and 736 are located at locations 738, 740, 742, 744, 746, 748, 750, 752, 754, 756, 758, 759, and 760, respectively, in hangar 706. These locations are fixed locations for the camera systems in this depicted example.
[0085] [00085] As illustrated, sensor systems 708 also include camera systems 762, 764, and 766. Camera systems 762, 764, and 766 are connected to robots 768, 770, and 772, respectively. These robots allow camera systems 762, 764, and 766 to move within inspection area 702.
[0086] [00086] For example, robot 768 and robot 770 are configured to move camera system 762 and camera system 764, respectively, on a surface. This surface can be, for example, floor 774 of hangar 706 or surface 710 of aircraft 704. For example, robot 768 is located at location 778 on the surface of a wing of aircraft 704.
[0087] [00087] In this illustrative example, robot 772 is configured to move camera system 766 through air 776 in hangar 706. In other words, robot 772 flies so that camera system 766 moves through air 776. In some illustrative examples, robot 772 can be configured to pick up, transport, and deploy a robot, such as robot 768, with camera system 762 onto surface 710 of aircraft 704. In this way, camera systems 762, 764, and 766 are able to move to different locations within inspection area 702 to generate images of different parts of surface 710 of aircraft 704.
[0088] [00088] The images generated by sensor systems 708 can be sent to a computer system, such as computer system 316 in Figure 3, for processing. The images can be used to determine whether inconsistencies are present on surface 710 of aircraft 704. Part 780 of inspection environment 700 is illustrated in an enlarged view in Figure 8 below.
[0089] [00089] Turning now to Figure 8, an illustration of an enlarged perspective view of part of an inspection environment is shown in accordance with an illustrative embodiment. In this illustrative example, part 780 of inspection environment 700 of Figure 7 is shown.
[0090] [00090] As shown, camera system 720 has field of view 800. Camera system 728 has field of view 802. Camera system 734 has field of view 804. Additionally, camera system 764 has field of view 806. The locations of camera systems 720, 728, 734, and 764 allow images to be generated for different parts of surface 710 of aircraft 704.
[0091] [00091] With reference now to Figure 9, an illustration of a front view of an inspection environment is shown in accordance with an illustrative embodiment. In this illustrative example, inspection environment 700 of Figure 7 is depicted from a front view of inspection area 702 and aircraft 704 in inspection area 702.
[0092] [00092] Turning now to Figure 10, an illustration of a top view of a volume in an inspection area is shown in accordance with an illustrative embodiment. In this illustrative example, volume 1000 is identified within inspection area 702 in inspection environment 700 of Figure 7.
[0093] [00093] In this illustrative example, volume 1000 comprises plurality of parts 1002. The plurality of parts 1002 is selected to cover substantially the entire surface 710 of aircraft 704. In other words, aircraft 704 is contained within the plurality of parts 1002.
[0094] [00094] As illustrated, each camera system in sensor systems 708 is assigned to a number of parts within the plurality of parts 1002. In this way, each camera system generates images of surface 710 of aircraft 704 within the field of view of that camera system in the number of parts assigned to it. As an illustrative example, camera system 716 is assigned to parts 1004, 1006, 1008, 1010, 1012, 1014, 1016, and 1018.
[0095] [00095] Additionally, in this depicted example, each camera system is assigned to the number of parts in the plurality of parts 1002 based on whether that camera system is capable of generating images with a desired quality level of surface 710 of aircraft 704 in a particular part in the plurality of parts 1002. The quality of the images generated by each camera system may depend on the distance of each camera system from surface 710 of aircraft 704.
[0096] [00096] In this illustrative example, camera system 735 and camera system 736 in Figure 7 are not shown in order to provide a clearer view of the plurality of parts 1002 in volume 1000. Camera system 735 in Figure 7 is assigned to part 1016 and part 1022 in the plurality of parts 1002 in this depicted example. Additionally, camera system 736 in Figure 7 is assigned to part 1018 and part 1020.
[0097] [00097] Referring now to Figure 11, an illustration of a side view of a volume in an inspection area is shown in accordance with an illustrative embodiment. In this illustrative example, a side view of volume 1000 in Figure 10 is shown. As depicted, only a portion of the plurality of parts 1002 of Figure 10 is shown. Referring now to Figure 12, an illustration of a perspective view of an inspection environment is shown in accordance with an illustrative embodiment. In this illustrative example, inspection environment 700 of Figure 7 is shown with rail system 1200.
[0098] [00098] As shown, rail system 1200 includes rails 1202, 1204, 1206, 1208, 1210, 1212, 1214, 1216, 1218, 1220, 1222, and 1224. Camera systems 714, 716, 718, 720, 722, 724, 726, 728, 730, 732, and 734 are configured to move in the direction of arrow 1227 along rails 1202, 1204, 1206, 1208, 1210, 1212, 1214, 1216, 1218, 1220, 1222, and 1224, respectively. Camera system 735 and camera system 736 are configured to move in the direction of arrow 1226 along rail 1224.
[0099] [00099] In this way, the locations of these camera systems can be changed. The locations of the camera systems can be changed to reassign the camera systems to different parts of the volume identified within an inspection area, such as volume 1000 in Figure 10. The locations of the camera systems can also be changed to take into account the size and/or shape of different aircraft and/or other structures located within inspection area 702 in hangar 706.
[0100] [000100] With reference now to Figure 13, an illustration of a flowchart of a process for inspecting an object is shown in accordance with an illustrative embodiment. The process illustrated in Figure 13 can be implemented in inspection environment 300 in Figure 3.
[0101] [000101] The process begins by identifying a volume that contains an object in response to the presence of the object in an inspection area (operation 1300). The volume has a plurality of parts. The volume can be, for example, volume 1000 with the plurality of parts 1002 in Figure 10.
[0102] [000102] The process then assigns a number of sensor systems to the plurality of parts of the volume (operation 1302). Each sensor system in the number of sensor systems is assigned to a number of parts in the plurality of parts of the volume based on whether each sensor system is capable of generating data with a desired level of quality on a surface of the object in a particular part in the plurality of parts. In this illustrative example, the number of sensor systems can be camera systems configured to generate the data in the form of still images and/or video.
[0103] [000103] Thereafter, the process generates data on the surface of the object using the number of sensor systems assigned to the plurality of parts of the volume (operation 1304). The process then determines whether a number of inconsistencies is present on the surface of the object using the data (operation 1306). Operation 1306 is performed by comparing the data to reference data, such as reference data 340 in Figure 3, in these examples.
[0104] [000104] If a number of inconsistencies is not present on the surface of the object, the process terminates. Otherwise, if a number of inconsistencies is present on the surface of the object, the process identifies a number of maintenance operations to be performed on the object (operation 1308), with the process terminating thereafter.
[0105] [000105] The number of maintenance operations may include, for example, reshaping the surface of the object, repairing the surface of the object, replacing a part associated with the surface of the object, performing additional inspection of the number of inconsistencies, and/or other suitable operations. In this illustrative example, operation 1308 can also include initiating the number of identified maintenance operations. For example, in operation 1308, if additional inspection of the number of inconsistencies is identified, the process can send commands to a mobile test system to send the mobile test system to the number of inconsistencies.
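A minimal, runnable outline of the sequence of operations in Figure 13, using deliberately simplified stand-in logic for each operation (the data structures and threshold are assumptions used only to show the flow, not the disclosed implementation):

```python
def inspect(object_present, parts, sensor_systems, generated_values, reference_values, limit=25):
    """Walk through operations 1300-1308 with trivially simplified stand-ins."""
    if not object_present:                               # wait for object in the inspection area
        return None
    assignment = {s: parts for s in sensor_systems}      # operation 1302 (every sensor covers every part here)
    generated = {s: generated_values for s in assignment}  # operation 1304 (same data per sensor in this sketch)
    inconsistencies = [
        (sensor, index)
        for sensor, values in generated.items()
        for index, (value, reference) in enumerate(zip(values, reference_values))
        if abs(value - reference) > limit                # operation 1306: compare against reference data
    ]
    if not inconsistencies:
        return []                                        # no inconsistencies: process terminates
    return [("additional inspection", location) for location in inconsistencies]  # operation 1308

print(inspect(True, ["part_1004"], ["camera_716"], [100, 190], [100, 100]))
```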
[0106] [000106] The flowcharts and block diagrams in the different depicted embodiments illustrate the architecture, functionality, and operation of some possible implementations of apparatus and methods in different illustrative embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, function, and/or a portion of an operation or step. In some alternative implementations, the function or functions noted in the block may occur out of the order noted in the figures. For example, in some cases, two blocks shown in succession may be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. Also, other blocks may be added in addition to the blocks illustrated in a flowchart or block diagram.
[0107] [000107] Thus, the different illustrative embodiments provide a method and apparatus for inspecting objects, such as aircraft. In response to the presence of an object in an inspection area, a volume that contains the object is identified. This volume has a plurality of parts. A number of sensor systems is assigned to the plurality of parts of the volume. Each sensor system in the number of sensor systems may be assigned to a number of parts in the plurality of parts of the volume. This assignment of the number of sensor systems is based on whether each sensor system is capable of generating the data with a desired level of quality on a surface of the object in a particular part of the plurality of parts. Data on the surface of the object is then generated using the number of sensor systems assigned to the plurality of parts of the volume. A determination is made as to whether a number of inconsistencies is present on the surface of the object. This information can then be used to perform maintenance and/or other operations on the object.
[0108] [000108] In this way, the different illustrative embodiments reduce the time, effort, and/or equipment needed to inspect an object, such as an aircraft. With the use of a number of sensor systems assigned to generate the data with a desired quality for particular parts of a volume in an inspection area that contains the object, the inspection of the object can be made easier, less time consuming, more accurate, and/or more consistent as compared to currently available inspection methods.
[0109] [000109] The different illustrative embodiments can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements. Some embodiments are implemented in software, which includes, but is not limited to, forms such as, for example, firmware, resident software, and microcode.
[0110] [000110] Furthermore, the different embodiments can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any device or system that executes instructions. For the purposes of this disclosure, a computer-usable or computer-readable medium can generally be any tangible apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0111] [000111] The computer-usable or computer-readable medium can be, for example, without limitation, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, or a propagation medium. Non-limiting examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk, and an optical disc. Optical discs may include a read-only compact disc (CD-ROM), a read/write compact disc (CD-R/W), and a DVD.
[0112] [000112] Further, a computer-usable or computer-readable medium may contain or store computer-readable or computer-usable program code such that, when the computer-readable or computer-usable program code is executed on a computer, the execution of this computer-readable or computer-usable program code causes the computer to transmit other computer-readable or computer-usable program code over a communications link. This communications link can use a medium that is, for example, without limitation, physical or wireless.
[0113] [000113] A data processing system suitable for storing and/or executing computer-readable or computer-usable program code will include one or more processors coupled directly or indirectly to memory elements through a communications structure, such as a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories, which provide temporary storage of at least some computer-readable or computer-usable program code to reduce the number of times code may be retrieved from bulk storage during execution of the code.
[0114] [000114] Input/output or I/O devices can be coupled to the system either directly or through intervening I/O controllers. These devices may include, for example, without limitation, keyboards, touch screens, and pointing devices. Different communications adapters can also be coupled to the system to enable the data processing system to be coupled to other data processing systems, printers, or remote storage devices through intervening private or public networks. Modems and network adapters are just a few non-limiting examples of the currently available types of communications adapters.
[0115] [000115] The description of the different illustrative embodiments has been presented for purposes of illustration and description and is not intended to be exhaustive or limited to the embodiments in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Further, different illustrative embodiments may provide different advantages as compared to other illustrative embodiments. The embodiment or embodiments selected are chosen and described in order to best explain the principles of the embodiments and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.
Claims (15)
[0001]
Method for inspecting an object (302, 704), wherein the method is characterized by the fact that it comprises: responsive to the presence of the object (302, 704) in an inspection area (312, 702), identifying in the inspection area (312, 702) a volume (326, 1000) containing the object (302, 704), wherein the volume (326, 1000) has a plurality of parts (328, 1002); assigning a number of sensor systems (314, 315, 500, 708) to the plurality of parts (328, 1002) of the volume (326, 1000), wherein each sensor system (314, 315, 500, 708) in the number of sensor systems is assigned to a number of parts (328) in the plurality of parts (328, 1002) of the volume (326, 1000) based on whether each sensor system (314, 315, 500, 708) is capable of generating data (324, 504) with a desired quality level (332, 522) on a surface (321, 710) of the object (302) in a particular part (336) in the plurality of parts (328, 1002); generating the data (324, 504) on the surface (321, 710) of the object (302, 704) using the number of sensor systems (314, 315, 500, 708) assigned to the plurality of parts (328, 1002) of the volume (326, 1000); and determining whether a number of inconsistencies (304) is present on the surface (321, 710) of the object (302) using the data (324, 504).
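For illustration only, the assignment and data-generation steps of claim 1 can be sketched as follows in Python; the names used here (SensorSystem, assign_sensor_systems, inspect) and the callable parameters generate and detect are assumptions made for this sketch and are not part of the claims.

```python
# Illustrative sketch only; names and data structures are assumptions,
# not part of the claimed method.
from dataclasses import dataclass


@dataclass
class SensorSystem:
    name: str
    reachable_parts: set   # parts of the volume the sensor system can cover
    quality: dict          # part -> quality level the system can achieve for it


def assign_sensor_systems(parts, sensor_systems, desired_quality):
    """Assign each part of the volume to every sensor system able to
    generate data for that part at the desired quality level."""
    assignment = {part: [] for part in parts}
    for system in sensor_systems:
        for part in parts:
            if part in system.reachable_parts and system.quality.get(part, 0) >= desired_quality:
                assignment[part].append(system)
    return assignment


def inspect(parts, sensor_systems, desired_quality, generate, detect):
    """Generate surface data with the assigned sensor systems and determine
    whether a number of inconsistencies is present; detection itself is
    delegated to the `detect` callable, e.g. a comparison with reference data."""
    assignment = assign_sensor_systems(parts, sensor_systems, desired_quality)
    surface_data = {
        (part, system.name): generate(system, part)
        for part, systems in assignment.items()
        for system in systems
    }
    return detect(surface_data)
```

A comparison-based sketch of the detection step appears after claim 5 below.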
[0002]
Method according to claim 1, characterized by the fact that it further comprises: responsive to a determination that the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704), identifying a number of maintenance operations (346) to perform on the object (302, 704).
[0003]
Method according to claim 1, characterized by the fact that the number of sensor systems (314, 315, 500, 708) generates images (338) of the surface (321, 710) of the object (302, 704) and further comprises: responsive to a determination that the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704), sending a mobile test system (354, 602) to the number of inconsistencies (304); and performing non-destructive tests (356) at a location (308) for each of the number of inconsistencies (304) using the mobile test system (354, 602).
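As a non-limiting illustration of claim 3, the sketch below dispatches a mobile test system to the location of each detected inconsistency and runs a non-destructive test there; MobileTestSystem, move_to, and run_ndt are hypothetical names introduced only for this example.

```python
# Hypothetical sketch; class and method names are illustrative only.
class MobileTestSystem:
    def move_to(self, location):
        print(f"moving test system to {location}")

    def run_ndt(self, location):
        # Placeholder for an actual non-destructive test, such as an
        # ultrasonic or eddy-current measurement.
        print(f"running non-destructive test at {location}")
        return {"location": location, "passed": True}


def test_inconsistencies(inconsistency_locations, test_system):
    """Visit each inconsistency location and collect non-destructive
    test results."""
    results = []
    for location in inconsistency_locations:
        test_system.move_to(location)
        results.append(test_system.run_ndt(location))
    return results
```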
[0004]
Method according to claim 1, characterized by the fact that it further comprises: sending the data (324, 504) on the surface (321, 710) of the object (302, 704) to a computer system (316); and wherein the step of determining whether the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704) using the data (324, 504) comprises: determining, by the computer system (316), whether the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704) using the data (324, 504).
[0005]
Method according to claim 1, characterized by the fact that the step of determining whether the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704) using the data (324, 504) comprises: comparing the data (324, 504) with reference data (340) to create a comparison; and determining whether the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704) using the comparison.
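A minimal sketch of the comparison recited in claim 5, assuming the generated data and the reference data are grayscale images of the same surface region held as NumPy arrays; the threshold and minimum-pixel values are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np


def compare_with_reference(image, reference_image, threshold=30, min_pixels=25):
    """Return a flag indicating whether an inconsistency is present and a
    pixel mask of where the image differs from the reference image."""
    # Work in a signed type so the subtraction cannot wrap around.
    diff = np.abs(image.astype(np.int16) - reference_image.astype(np.int16))
    mask = diff > threshold
    inconsistency_present = int(mask.sum()) >= min_pixels
    return inconsistency_present, mask
```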
[0006]
Method according to claim 1, characterized by the fact that it further comprises: responsive to a determination that the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704), identifying a number of locations (348) on the surface (321, 710) of the object (302, 704) for the number of inconsistencies (304).
[0007]
Method according to claim 5, characterized by the fact that the reference data (340) are a number of images (338) of the surface (321, 710) of the object (302, 704) generated by performing at least one of generating the number of images (338) after manufacturing the object (302, 704), generating the number of images (338) using a model of the object (302, 704), and generating the number of images (338) before current use of the object (302, 704).
[0008]
Method according to claim 1, characterized by the fact that the number of sensor systems (314, 315, 500, 708) includes a mobile sensor system (315) and further comprises: moving the mobile sensor system (315) in the inspection area (312, 702) while the data (324, 504) on the surface (321, 710) of the object (302, 704) is generated.
[0009]
Method according to claim 1, characterized by the fact that it further comprises: selecting the plurality of parts (328, 1002) of the volume (326, 1000) such that the data (324, 504) is generated for substantially the entire surface (321, 710) of the object (302, 704).
[0010]
Apparatus for inspecting an object, the apparatus characterized by the fact that it comprises: an inspection area (312, 702); a number of sensor systems (314, 315, 500, 708) located in the inspection area (312, 702); and a computer system (316) in communication with the number of sensor systems (314, 315, 500, 708) and configured, responsive to the presence of the object (302, 704) in the inspection area (312, 702), to identify in the inspection area (312, 702) a volume (326, 1000) containing the object (302, 704), wherein the volume (326, 1000) has a plurality of parts (328, 1002); assign the number of sensor systems (314, 315, 500, 708) to the plurality of parts (328, 1002) of the volume (326, 1000), wherein each sensor system (314, 315, 500, 708) in the number of sensor systems (314, 315, 500, 708) is assigned to a number of parts (328, 1002) in the plurality of parts (328) of the volume (326, 1000) based on whether each sensor system (314, 315, 500, 708) is capable of generating data (324, 504) with a desired quality level (332, 522) on a surface (321, 710) of the object (302, 704) in a particular part (336) in the plurality of parts (328, 1002); generate the data (324, 504) on the surface (321, 710) of the object (302, 704) using the number of sensor systems (314, 315, 500, 708) assigned to the plurality of parts (328, 1002) of the volume (326, 1000); and determine whether a number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704).
[0011]
Apparatus according to claim 10, characterized by the fact that the computer system (316) is further configured to identify a number of maintenance operations (346) to perform on the object in response to a determination that the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704).
[0012]
Apparatus according to claim 10, characterized by the fact that it further comprises: a mobile test system (354, 602), wherein the computer system (316) is further configured to send the mobile test system (354, 602) to the number of inconsistencies (304) in response to a determination that the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704), and wherein the mobile test system (354, 602) is configured to perform non-destructive tests (356) at a location (308) for each of the number of inconsistencies (304).
[0013]
Apparatus according to claim 12, characterized by the fact that the computer system (316) is further configured to identify the location (308) for each of the number of inconsistencies (304) using the data (324, 504) on the surface (321, 710) of the object (302, 704).
[0014]
Apparatus according to claim 10, characterized by the fact that, in being configured to determine whether the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704) using the data (324, 504), the computer system (316) is configured to compare the data (324, 504) with the reference data (340) to create a comparison; and determine whether the number of inconsistencies (304) is present on the surface (321, 710) of the object (302, 704) using the comparison.
[0015]
Apparatus according to claim 10, characterized by the fact that the number of sensor systems (314, 315, 500, 708) includes a mobile sensor system (315), wherein the mobile sensor system (315) is configured to move in the inspection area (312, 702) while the data (324, 504) is generated on the surface (321, 710) of the object (302, 704).