Position and orientation determination using motion data
Patent abstract:
"Position and orientation determination using motion data". The invention relates to position determination systems and methods. A specific handheld device includes a calibration component to communicate with a local positioning system to determine an initial position and orientation of the handheld device within a local coordinate system associated with a target structure. The handheld device also includes at least one motion sensor to detect movement of the handheld device. The portable device further includes a processor for determining a measured position and orientation of the portable device based on the initial position and orientation of the portable device within the local coordinate system and based on the detected movement of the portable device.
Publication number: BR112012014802B1
Application number: R112012014802
Filing date: 2010-11-12
Publication date: 2020-04-07
Inventors: E Georgeson Gary; J Troy James; J Hellenbrand Peter; W Lea Scott; Motzer William
Applicant: Boeing Co
IPC main class:
Patent description:
Descriptive Report of the Invention Patent for POSITION AND ORIENTATION DETERMINATION USING MOTION DATA. BACKGROUND [0001] This description generally relates to the determination of position and orientation using motion data. [0002] The assembly and maintenance of large objects, such as repositionable structures, can be aided by the use of positioning systems to guide the placement of components. For example, airplanes can be assembled in facilities that include positioning systems and equipment to measure the position of certain components of the airplane, such as the wings and empennage, to ensure proper placement of the components. Other large-scale assembly facilities may use similar positioning systems. Certain position determination systems are limited to line-of-sight measurements. These position determination systems may not be useful for determining position information in an interior or other occluded areas of the structure being assembled or maintained. For example, a laser-based system that is oriented to determine the position of an airplane's wings may not be able to also determine the positions of components within the airplane's cargo compartment. SUMMARY [0003] Position determination systems and methods are described. A specific method includes initializing a handheld device capable of detecting movement by storing initial position and orientation data in a memory of the handheld device. The initial position and orientation data can indicate the position and orientation of the handheld device within a local coordinate system associated with a target structure, such as a repositionable structure or a mobile platform (Petition 870190109803, of 10/29/2019, p. 5/50). The method also includes detecting movement of the handheld device. The method also includes determining a spatial relationship of the handheld device to the target structure based, at least partially, on the initial position and orientation data and on the movement data.
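The initialize-then-track method of paragraph [0003] can be sketched as a simple dead-reckoning loop. The class, method, and field names below are illustrative assumptions, not taken from the patent, and the sketch tracks only a heading angle rather than a full 3-D orientation:

```python
import math

class HandheldTracker:
    """Dead-reckoning sketch: start from a calibrated pose in the target
    structure's local coordinate system, then accumulate the relative motion
    reported by the device's motion sensors."""

    def __init__(self, position, yaw_deg):
        self.position = list(position)  # x, y, z in the local coordinate system
        self.yaw_deg = yaw_deg          # heading only; full 3-D would use a rotation matrix

    def apply_motion(self, dx, dy, dz, dyaw_deg):
        # Each sensor sample reports displacement and rotation since the last pose.
        self.position[0] += dx
        self.position[1] += dy
        self.position[2] += dz
        self.yaw_deg = (self.yaw_deg + dyaw_deg) % 360.0

    def spatial_relationship(self, target):
        # Distance and unit direction from the device to a target-structure location.
        offset = [t - p for t, p in zip(target, self.position)]
        distance = math.sqrt(sum(c * c for c in offset))
        direction = [c / distance for c in offset] if distance else offset
        return distance, direction

# Calibrate at the local origin (e.g., in the calibration frame), then move.
tracker = HandheldTracker((0.0, 0.0, 0.0), 0.0)
tracker.apply_motion(2.0, 0.0, 0.0, 90.0)
distance, direction = tracker.spatial_relationship((2.0, 3.0, 0.0))
print(distance)   # 3.0
print(direction)  # [0.0, 1.0, 0.0]
```

The spatial relationship is recomputed from the accumulated pose on each query, which is why the initial calibration against the local positioning system is essential: every later answer is relative to it.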
[0004] In a specific embodiment, the portable device includes a calibration component to communicate with a local positioning system to determine an initial position and orientation of the portable device within a local coordinate system associated with a target structure. The handheld device also includes at least one motion sensor to detect movement of the handheld device. The handheld device may include a processor for determining a measured position and orientation of the handheld device based on the initial position and orientation of the handheld device within the local coordinate system and the detected motion of the handheld device. In addition, or alternatively, the handheld device can have a wired or wireless connection to a remote processor for determining measurements based on movement detected by the handheld device. [0005] A specific positioning system includes a plurality of local position detectors to determine a position of a location of interest on or within a target structure within a first coordinate system. The positioning system also includes a processor to determine measurements in a second coordinate system of the target structure based on the position and orientation of the target structure in the first coordinate system and one or more electronic representations of the target structure. The positioning system also includes a portable device, such as a device for detecting relative movement. The portable device includes a memory to store an initial position and orientation of the portable device within the second coordinate system. The handheld device also includes at least one motion sensor to detect movement of the handheld device. The portable device also includes a processor for determining a measured position and orientation of the portable device based on the initial position and orientation of the portable device within the second coordinate system and the detected movement of the portable device.
[0006] The characteristics, functions and advantages that have been described can be achieved independently in various embodiments or can be combined in still other embodiments, further details of which are described with reference to the description and drawings below. BRIEF DESCRIPTION OF THE DRAWINGS [0007] Figure 1 is a block diagram of a first specific embodiment of a local positioning system that includes a portable device; Figure 2 is an illustration of a second specific embodiment of a local positioning system that includes a portable device; Figure 3 is a flowchart of a first specific embodiment of a method for determining a spatial relationship of a portable device in relation to a target structure; Figure 4 is a flowchart of a second specific embodiment of a method for determining a spatial relationship of a portable device with respect to a target structure; and Figure 5 is a flowchart of a third specific embodiment of a method for determining a spatial relationship of a portable device with respect to a target structure. DETAILED DESCRIPTION [0008] This description refers to a portable device that interacts with a positioning system to determine position information in locations where the positioning system cannot independently determine position information due to, for example, access, expense, time, capacity or functional limitations of the positioning system. The portable device can be useful for the maintenance or assembly of large-scale structures, such as airplanes, vessels, land vehicles, spaceships, oil platforms, heavy equipment, or other repositionable or stationary structures. Such a target structure can be assembled in facilities that use positioning systems to improve the precision, accuracy or speed of an assembly or maintenance process. Certain positioning systems, such as laser positioning systems, may be limited to uses that are in direct line of sight with respect to a position detection device.
Consequently, positions within the target structure can be difficult to determine. The handheld device can be operable to determine position information within the target structure or in other locations. In addition, the handheld device can use position and orientation information to provide other information useful to an operator, such as identifying the locations of specific components, identifying components based on location, gathering and storing information at specific locations, and so on. [0009] Figure 1 is a block diagram of a specific embodiment of a local coordinate positioning system, also known as a local positioning system. The system includes a portable device 120 that interacts with a positioning system 102. Positioning system 102 can include a local positioning system or a non-local positioning system. A positioning system can measure locations and describe locations in a specific coordinate system associated with an object being measured. A measuring device, such as handheld device 120, can be calibrated to the local coordinate system using reference information in the specific coordinate system, such as an inspection marker or other reference point. The reference information is used to determine the relative position and orientation of the measuring device with respect to a target structure. Local positioning systems can use optical or electromagnetic measurement sensors (for example, laser measurement systems and the like) to take measurements, and then transform these device-based measurements into measurements defined in the local coordinates of the target structure. A non-local positioning system can determine or specify locations in relation to a reference coordinate system that is not associated with the target structure. For example, the reference coordinate system can be associated with a building or other geographic location, such as a dry-dock installation.
A raw output (e.g., latitude and longitude) from a global positioning system that uses satellite orbital positions as references would be an example of a non-local positioning system, as would a measurement given by a simple tape measure (which provides a distance relative to the edge of a housing). In their basic forms, these devices provide measurements without considering the coordinate system of the object being measured. It is up to the user (or some other system) to determine how these results apply to the situation. [00010] Certain positioning systems 102 may not work well when obstructions are present. For example, laser positioning systems can only operate when a direct line of sight is available between a laser source and a target. In addition, certain positioning systems 102 can be difficult, expensive or time-consuming to set up or to add new measurement locations to. For example, it may be undesirable to set up a laser positioning system to identify target positions at every possible location within an area served by the laser positioning system. In a specific embodiment, the portable device 120 allows the determination of position and orientation information in areas not directly served by the positioning system 102 (for example, because the areas are outside the direct measurement area of the positioning system 102 or because obstructions reduce the accuracy of measurements made by the positioning system 102). For example, large-scale manufacturing and maintenance facilities may use a local positioning system, such as positioning system 102, to guide assembly and maintenance functions. Examples of such facilities may include aircraft assembly facilities, shipbuilding facilities, oil platforms and other facilities for the manufacture and maintenance of large equipment.
Such facilities may use line-of-sight measurement systems, such as laser positioning systems, to determine position information outside a system being assembled (for example, an airplane, a ship, etc.). However, such line-of-sight measurement systems can be difficult, costly or cumbersome to use in certain locations, such as locations within the system being assembled. Handheld device 120 can be used by personnel working within the system being assembled to determine position and orientation information. [00011] The positioning system 102 can include one or more local position detectors 104 to determine a position of a location on a target structure (not shown in Figure 1). In a specific embodiment, local position detectors 104 include laser devices adapted to transmit laser beams 162 at target locations 152. Positioning system 102 may also include a processor 106. Target locations 152 may be at known locations on the target structure and can be used by processor 106 and coordinate transformation software to calibrate the local position detector(s) 104 to a coordinate system of the target structure. The target structure may include an airplane, spacecraft, land vehicle, vessel, or other repositionable structure that is moved to a location for maintenance, assembly, or other purposes. In other examples, the target structure can be a stationary object. The processor 106 of the positioning system 102 may have access to a database 108 that includes electronic representations 110 of the target structure. For example, electronic representations 110 may include electronic diagrams, drawings or models of the target structure, such as computer-aided design drawings or three-dimensional models that provide local position information or locations of interest on the target structure.
In addition, electronic representations can include as-designed or as-built drawings or measured data (for example, measurements made for use by the local positioning system) that indicate specifics (for example, dimensions or characteristics) of the target structure. [00012] The portable device 120 includes a calibration component 126. The calibration component 126 communicates with the positioning system 102 to specify an initial position and orientation of the portable device 120. For example, the initial position and orientation can be specified with reference to a specific location on the target structure. [00013] The portable device 120 may also include at least one motion sensor 128. For example, the at least one motion sensor 128 may include one or more inertial sensors (for example, microelectromechanical sensors, such as accelerometers or gyroscopes) capable of measuring linear and rotational acceleration and/or velocity. The at least one motion sensor 128 can detect movement of the portable device 120 after the initial position and orientation of the portable device 120 have been determined. To illustrate, the portable device 120 can be initialized in a calibration frame 150. The position and orientation of the calibration frame 150 can be known or can be determined by the positioning system 102. The position and orientation of the calibration frame 150 can be stored in a memory 124 of the portable device 120. In a specific embodiment, the position and orientation of the calibration frame 150 are initialized in memory 124 as an origin position and orientation of the local coordinate system. That is, the initial position and orientation of the handheld device 120 within the calibration frame 150 are the origin from which the relative movements of the handheld device 120 are measured.
In this embodiment, the position and orientation of the portable device 120 can be determined by the distance and direction away from the origin location (i.e., the calibration frame 150). In other embodiments, other origin locations are selected, such as a predetermined reference location on the target structure. A spatial relationship between the calibration frame 150 and the local position detectors 104 can be known or determined. For example, the calibration frame 150 can be placed in a known location on the target structure. In addition or alternatively, the calibration frame 150 may include target locations 152 that are detectable by local position detectors 104 to allow positioning system 102 to determine the position and orientation of the calibration frame 150 on the target structure. After the relative positions and orientations are determined for movements of the handheld device 120, the relative positions and orientations are used together with the position and orientation of the calibration frame to compute the resulting position and orientation of the handheld device 120 in the coordinate system of the target structure. [00014] The portable device 120 may also include a processor 122. Processor 122 may determine a measured position and orientation of the portable device 120 based on the initial position and orientation of the portable device 120 and the detected movement of the portable device 120. For example, processor 122 can determine the measured position and orientation of handheld device 120 within the local coordinate system of the target structure. In a specific embodiment, memory 124 is accessible to processor 122. Memory 124 can store target component data 140. Target component data 140 can specify component locations of the target structure in the local coordinate system.
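The step of combining the known pose of the calibration frame with device-relative motion can be illustrated with a planar sketch. The function name and the 2-D simplification are assumptions made here for brevity; a full implementation would compose 3-D rotations:

```python
import math

def to_target_coordinates(frame_origin, frame_yaw_deg, relative_offset):
    """Map a displacement measured relative to the calibration frame into the
    target structure's coordinate system (planar case; a real system would
    use 3-D rotation matrices or quaternions)."""
    yaw = math.radians(frame_yaw_deg)
    dx, dy = relative_offset
    # Rotate the device-relative offset by the frame's known orientation...
    rx = dx * math.cos(yaw) - dy * math.sin(yaw)
    ry = dx * math.sin(yaw) + dy * math.cos(yaw)
    # ...then translate by the frame's known position on the target structure.
    return (frame_origin[0] + rx, frame_origin[1] + ry)

# Calibration frame at (10, 5) in target coordinates, rotated 90 degrees;
# the device has moved 3 units "forward" in frame coordinates.
x, y = to_target_coordinates((10.0, 5.0), 90.0, (3.0, 0.0))
print(round(x, 6), round(y, 6))  # 10.0 8.0
```

This rotate-then-translate composition is why the frame's pose must be known or measured first: an error in the frame's orientation is inherited by every subsequent device pose.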
Processor 122 can determine a spatial relationship of handheld device 120 to at least one of the components based on the measured position and orientation and on target component data 140. For example, processor 122 can determine a distance from handheld device 120 to at least one of the components, a direction from the portable device 120 to at least one of the components, or both. [00015] Handheld device 120 may also include an input device 132. Input device 132 can be used to receive target component data 140 that specifies a target component among the components of the target structure, or the location of data of interest 142 on the target component. For example, input device 132 can be used by an operator to enter information that identifies a specific component of the target structure that the operator wants to locate (i.e., the target component). Processor 122 can determine information about a spatial relationship between handheld device 120 and the target component based on the measured position and orientation and on target component data 140. [00016] Handheld device 120 may also include a pointer 134. Pointer 134 may be adapted to point to a location of the target component based on the spatial relationship. For example, pointer 134 may include a needle, one or more light indicators, a display, or another device on handheld device 120 or attached to handheld device 120 that points in the direction of the target component. In another example, pointer 134 may include a laser or other light-based pointing device. In this example, pointer 134 can direct a beam of light toward the target component. An integrated video camera (not shown) can also be used to indicate the pointing direction of the portable device 120. [00017] In a specific embodiment, the portable device 120 includes an indicator 138. Processor 122 can activate indicator 138 based on the spatial relationship of the portable device 120 to the target structure.
In various embodiments, indicator 138 includes different types of indicating devices that are triggered to indicate various conditions. For example, processor 122 may activate indicator 138 when handheld device 120 is within a predetermined limit distance from the target component. In another example, processor 122 can activate indicator 138 when the portable device 120 is within or approaching a specific portion of the target structure, such as an area that is prohibited for certain personnel or an area that requires special protective equipment for entry. In another example, indicator 138 may include an audible indicator, a visible indicator, or another indicator to indicate to the user that the handheld device 120 is close to the target component (for example, when a second limit distance is satisfied). [00018] In another example, indicator 138 includes a laser projection device that can trace a contour or other shape associated with the target component or with a location where the target component should be. For example, when target component data 140 identifies a component to be installed within the target structure, the laser projection device can project an outline of the target component at the installation location of the target component. Indicator 138 may also include a device for indicating that the target component may be out of sight of the operator, such as behind a second component. To illustrate, when the target component is located within a panel, in a different compartment, or behind another component from the handheld device 120, indicator 138 may indicate that the target component is out of the operator's line of sight. [00019] Processor 122 can trigger indicator 138 when a limit distance is satisfied. Processor 122 can trigger different indicators 138 when different limit distances are satisfied.
For example, a first indicator can be triggered when a first limit distance is satisfied and a second indicator can be triggered when a second limit distance is satisfied. The first limit distance can be different from the second limit distance, and the first indicator can be different from the second indicator. For example, the first indicator can include a first audible tone and the second indicator can include a second audible tone. In another example, the first indicator may include an audible tone and the second indicator may include a visible indication, such as a light. In yet another example, the first indicator may include a visual indication such as a first lamp and the second indicator may include a second visual indication such as a second lamp. In this example, the lamps can be visually distinct, such as different colors or different locations on handheld device 120. [00020] The limit distances can be determined based on an identification of the target component. For example, a greater limit distance can be determined for a larger component, and a smaller limit distance can be determined for a smaller component. In another example, a greater limit distance can be selected for a component that is expected to be visible and a smaller limit distance can be selected for a target component that is expected to be obscured (for example, behind a panel or another component). In a specific embodiment, one or more indicators are activated to indicate a direction to the target component. [00021] Handheld device 120 may include a sensor 136. Sensor 136 may include a digital image sensor, a security sensor (for example, a gas or thermal sensor), a magnetometer, a non-destructive test or inspection sensor (for example, an ultrasonic test sensor, a radiographic test sensor, an eddy current test sensor, etc.), another sensor, or any combination thereof.
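The limit-distance logic of paragraphs [0019] and [0020] can be sketched as follows; the indicator names and distance values are invented for illustration:

```python
def select_indicator(distance_m, limits):
    """Pick which indicator to trigger based on which limit distances are
    satisfied. 'limits' maps an indicator name to the distance at which it
    activates; the tightest satisfied limit wins (e.g., a tone far out, a
    lamp up close). Names and values here are hypothetical."""
    satisfied = [(limit, name) for name, limit in limits.items() if distance_m <= limit]
    if not satisfied:
        return None
    return min(satisfied)[1]

# Hypothetical limit distances for a component expected to be visible.
limits = {"audible_tone": 5.0, "lamp": 1.0}
print(select_indicator(10.0, limits))  # None (no limit satisfied yet)
print(select_indicator(3.0, limits))   # audible_tone
print(select_indicator(0.5, limits))   # lamp
```

Per paragraph [0020], the `limits` table itself would be chosen from the target component's identification, with smaller distances for small or obscured components.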
Sensor 136 can capture data at the position of handheld device 120. Processor 122 can store captured data 144 (e.g., a captured digital image) with position data in memory 124. [00022] In a specific embodiment, handheld device 120 includes an interface 130. Interface 130 may allow communication between handheld device 120 and one or more remote devices (such as a device of positioning system 102 or database 108). Interface 130 may include a wired interface, such as an Ethernet interface, a Universal Serial Bus (USB) interface, another standard or proprietary wired interface, or any combination thereof. Interface 130 may also or alternatively include a wireless interface. For example, interface 130 can send wireless transmissions 160 to communicate with the remote device. Wireless transmissions 160 can include infrared transmissions, radio-frequency transmissions or any other wireless transmission. Interface 130 can be used to initialize handheld device 120 or to send data collected by handheld device 120 to the remote device. For example, wireless transmissions 160 can be used to send data describing the measured position of the portable device 120 and the captured data 144 to the remote device. In another example, interface 130 can be used to send target component data 112 from database 108 to handheld device 120. To illustrate, database 108 may include electronic representations 110 of the target structure. Electronic representations 110 of the target structure can include as-designed or as-built drawings or three-dimensional electronic models of the target structure. Thus, the electronic representations 110 of the target structure can be relatively large electronic files. Due to size or cost restrictions, the memory 124 of the portable device 120 may not be able to store all of the electronic representations 110 of the target structure.
Consequently, the memory 124 of the portable device 120 can store only a relevant portion of the electronic representations 110 of the target structure, such as the target component data 112, 140. [00023] In operation, the portable device 120 can be used by an operator to locate the target component of the target structure, to gather data detected within the target structure, to identify a component at a specific location of the target structure, or for other purposes. To illustrate, an operator attempting to locate the target component can enter target component data 140 that identifies the target component via input device 132. The position of the target component can be determined by processor 122 of handheld device 120 or by processor 106 of a remote device based on electronic representations 110 of the target structure and target component data 112, 140. Indicator 138 can provide information to the operator regarding a direction, a distance, or the direction and the distance to the target component. In addition, indicator 138 can provide other information to assist the operator in locating the target component. For example, indicator 138 may include a display screen that displays a text description of the target component or of the location of the target component, displays a photograph or diagram that illustrates the target component, etc. In another example, indicator 138 may include one or more lights, moving arrows or other devices that indicate the direction or distance to the target component. In yet another example, indicator 138 may include a pointer, such as a visible laser that points in the direction of the target component. In yet another example, indicator 138 may include one or more audible or haptic output devices to indicate a direction or distance to the target component.
To illustrate, a vibrating element or a bell may be triggered when the operator moves the portable device 120 within a predetermined distance from the target component. The predetermined distance can be selected based on how easily the operator would be expected to identify the component. For example, the predetermined distance may be shorter (that is, closer to the component) for smaller components than for larger components. In another example, the predetermined distance may be shorter when the component would normally be obscured from the operator's view by another component. [00024] To illustrate another operational example, an operator can use handheld device 120 to gather or store captured data 144. For example, when the operator is performing an inspection of the target structure, handheld device 120 can be used to capture and store inspection findings along with location information. To illustrate, the portable device 120 can be carried by the operator during non-destructive testing and inspection. The sensor 136 of the portable device 120 can be used to capture the captured data 144, which can be stored with location information that indicates where the captured data 144 was captured. In another example, sensor 136 may include an image sensor. When the operator identifies an area of concern, the image sensor can be used to capture an image of the area of concern, which can be stored as captured data 144 along with the location where the image was captured. [00025] To illustrate another operational example, the portable device 120 can be used by the operator to assist with the identification of a component at a specific location in the target structure. For example, input device 132 can be used by the operator to select a specific position.
Processor 122 of handheld device 120 or processor 106 of the remote device can determine which components of the target structure are near the specific position. In a specific embodiment, pointer 134 can be used by the operator to point to the specific component. For example, pointer 134 may include a laser device that generates a visible beam of light. The operator can point the beam at the component and use input device 132 to generate a query regarding the component's identity. The query may request information regarding the component, such as a component identification, a component purpose, maintenance or assembly data for the component (for example, torque specifications), or any combination thereof. [00026] Figure 2 is an illustration of a specific embodiment of a local positioning system. The local positioning system can include one or more local position detectors 202 to determine a position of a location of interest within or on a target structure 200 within a first coordinate system. In a specific embodiment, local position detectors 202 include laser devices adapted to direct laser beams 220 to calibration targets 222 and to determine the positions of calibration targets 222 within the first coordinate system. In addition, each of the local position detectors 202 can include a target so that each of the local position detectors 202 can determine the positions of the other local position detectors 202. Although only three local position detectors 202 are illustrated in Figure 2, the local positioning system can include more or fewer local position detectors 202 depending on the specific implementation. [00027] Local position detectors 202 can be placed at known locations within the first coordinate system, such as a coordinate system associated with the manufacturing or maintenance area. Alternatively, a local position detector 202 can be placed at a location that is initially unknown, and can then use known reference locations on the target structure to determine its relative position and orientation with respect to the target structure in the coordinate system of the target structure. The target structure can include an airplane, spaceship, land vehicle, vessel, or other repositionable structure that is moved into the local coordinate system for maintenance, assembly, or other purposes. Target structure 200 can also include stationary objects, such as buildings, or large objects that move at a constant speed (zero acceleration) and on which all elements within the environment are stationary relative to each other, such as a cruise ship or an aircraft carrier. Calibration targets 222 can be applied to target structure 200 for detection by local position detectors 202, or calibration targets 222 can be features at known locations specified in the local coordinates of target structure 200. For example, as illustrated in Figure 2, the target structure 200 is an airplane and the calibration targets 222 are placed on the wings 212 and on the nose of an airplane fuselage 210. More or fewer calibration targets 222 can be used depending on the specific target structure 200, the locations of the local position detectors 202, or other factors that affect the determination of position for target structure 200 within the first coordinate system. [00028] Local position detectors 202 can interact with a processor 214 to determine the locations of calibration targets 222 within the first coordinate system. Processor 214 or local position detectors 202 can determine the position and orientation of target structure 200 within the first coordinate system based on the locations of calibration targets 222 and based on one or more electronic representations of the target structure 200.
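One way to picture paragraph [00028], under the simplifying assumptions of a planar structure and exactly two calibration targets, is a sketch that recovers the structure's position and heading from the measured target locations. The function name is invented, and a production system would instead fit many targets in 3-D by least squares:

```python
import math

def structure_pose_2d(model_pts, measured_pts):
    """Estimate the planar position and heading of the target structure from
    two calibration targets: their known model coordinates and their measured
    positions in the first coordinate system (illustrative two-point case)."""
    (mx0, my0), (mx1, my1) = model_pts
    (px0, py0), (px1, py1) = measured_pts
    # Heading: difference between the measured and model directions of the
    # baseline joining the two targets.
    model_angle = math.atan2(my1 - my0, mx1 - mx0)
    meas_angle = math.atan2(py1 - py0, px1 - px0)
    yaw = meas_angle - model_angle
    # Translation: where the model origin lands after rotating by the heading.
    ox = px0 - (mx0 * math.cos(yaw) - my0 * math.sin(yaw))
    oy = py0 - (mx0 * math.sin(yaw) + my0 * math.cos(yaw))
    return (ox, oy), math.degrees(yaw)

# Nose target at model (0, 0), wing target at model (0, 10);
# measured at (5, 5) and (-5, 5) in the first coordinate system.
origin, yaw_deg = structure_pose_2d([(0.0, 0.0), (0.0, 10.0)], [(5.0, 5.0), (-5.0, 5.0)])
print(origin, round(yaw_deg, 6))  # (5.0, 5.0) 90.0
```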
Electronic representations of target structure 200 may include electronic diagrams, drawings, or models of target structure 200, such as computer-aided design drawings or three-dimensional models. In addition, the electronic representations may include design drawings or as-built or measured data (for example, measurements made for use by the local positioning system) that indicate specifics of the target structure. [00029] Processor 214 can also determine a second coordinate system based on the position and orientation of target structure 200 within the first coordinate system. For example, the first coordinate system may be a local coordinate system (that is, a coordinate system associated with the target structure) or a non-local coordinate system (that is, a coordinate system associated with something other than the target structure, such as a building or a global reference system). Examples of non-local coordinate systems include coordinate systems that specify locations with reference to features that are not on the target structure, such as global systems that specify locations based on distances or directions to global features (for example, global positioning coordinates, latitude and longitude, and so on). In the local coordinate system, locations can be specified with reference to one or more known locations or features defined in relation to the target structure. [00030] In the second coordinate system, positions can be specified by distance or direction from features of target structure 200. For example, a position can be specified by a distance and a direction from the nose of the fuselage 210 of the airplane illustrated in figure 2. Processor 214 can determine a relationship between positions in the first coordinate system and the second coordinate system based on the position and orientation of target structure 200 within the first coordinate system and the electronic representations of target structure 200. [00031] Figure 2 also shows a portable device 234 arranged within a calibration frame 230. For example, portable device 234 can be a device such as portable device 120 of figure 1, and calibration frame 230 can be a calibration table such as the calibration table 150 of figure 1. The calibration frame 230 can be placed at a known position and orientation in the first coordinate system of the target structure. A spatial relationship between calibration frame 230 and local position detectors 202 can be known. For example, the calibration frame 230 and the local position detectors 202 can be placed in fixed, known positions within the first coordinate system. The calibration frame 230 can be located on or within the target structure 200, or the calibration frame 230 can be placed in other locations as long as its position and orientation are defined in the local target coordinate system. In addition, or alternatively, calibration frame 230 may include one or more targets 232 that are detectable by local position detectors 202 to allow the local positioning system to determine the position and orientation of calibration frame 230 with respect to the local position detectors 202 within the first coordinate system of the target structure. [00032] The portable device 234 can include a memory to store an initial position and orientation of the portable device 234. The initial position and orientation of the portable device 234 can be stored in relation to the first coordinate system, the second coordinate system, or both. In a specific embodiment, the position of the calibration frame 230 is defined as a home position for the second coordinate system.
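The relationship between the first coordinate system and the second coordinate system, whose home position is the calibration frame, amounts to an ordinary change of coordinates. The following is a hedged two-dimensional sketch under the assumption that the frame's pose (position plus yaw) in the first system is known; the function names and frame representation are illustrative, not from the patent.

```python
import math

def first_to_second(frame, point):
    """Express a point measured in the first coordinate system in the
    second system, whose origin and axes coincide with the calibration
    frame's known pose `frame = (x, y, yaw)`. 2-D illustrative sketch."""
    fx, fy, yaw = frame
    dx, dy = point[0] - fx, point[1] - fy
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * dx + s * dy, -s * dx + c * dy)   # inverse rotation

def second_to_first(frame, point):
    """Inverse mapping: a point given relative to the calibration frame,
    expressed back in the first coordinate system."""
    fx, fy, yaw = frame
    c, s = math.cos(yaw), math.sin(yaw)
    return (fx + c * point[0] - s * point[1],
            fy + s * point[0] + c * point[1])
```

The home position of the second system maps to the frame's pose in the first system, which is why zeroing the handheld's memory at the frame (described next) is equivalent to storing its pose in either system.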
Thus, the memory of the portable device 234 can initially be set to zero at the calibration frame 230, which indicates no movement from the home position. [00033] The handheld device 234 may also include one or more motion sensors to detect the movement of the handheld device 234. For example, the motion sensors may include one or more microelectromechanical systems (MEMS), such as sensor devices that measure relative acceleration and velocity. The motion detected by the motion sensors can be used by the processor 214 to calculate a position and orientation of the handheld device 234 after it is moved. The processor 214 may also be configured to provide an estimate of linear and rotational displacement during movement. In a specific embodiment, processor 214 is incorporated into handheld device 234. In another embodiment, processor 214 is external to handheld device 234. Processor 214 can determine a calculated position and orientation of handheld device 234 within target structure 200 or near target structure 200 based on the initial position and orientation of the handheld device 234 (e.g., within the calibration frame 230) and based on the movement of the handheld device 234 detected by the one or more motion sensors. [00034] To illustrate, local position detectors 202 may include line-of-sight detectors such as laser devices. Thus, positions in portions of target structure 200 that are not within line of sight of local position detectors 202, such as within fuselage 210, within a wheel well (not shown), or behind an obstruction, such as wings 212 or other components of target structure 200, may not be detectable by local position detectors 202. Consequently, handheld device 234 may be used to determine positions that are obscured from local position detectors 202 or where the placement of a target 222 would be inconvenient or undesirable.
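The pose calculation of paragraph [00033], propagating the handheld's position and orientation from the zeroed home pose using the sensed linear and rotational displacements, can be sketched as planar dead reckoning. The per-step (distance, heading-change) samples below are a hypothetical simplification of real MEMS output, not the patent's data format.

```python
import math

def dead_reckon(pose, steps):
    """Propagate a 2-D pose (x, y, heading) from an initial calibration
    pose through a sequence of (distance, heading_change) motion samples,
    a simplified stand-in for integrated MEMS sensor output."""
    x, y, heading = pose
    for dist, dtheta in steps:
        heading += dtheta              # apply the sensed rotation first
        x += dist * math.cos(heading)  # then advance along the new heading
        y += dist * math.sin(heading)
    return (x, y, heading)
```

Because each step compounds on the last, small sensor errors accumulate with distance traveled, which is why periodic re-initialization at a known pose (such as the calibration frame) matters in practice.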
In these cases, measurements will be based on the initial position and orientation of the handheld device 234 and on the relative motion detected by the handheld device 234. [00035] In operation, handheld device 234 can be used by an operator to locate components of target structure 200, to gather or store information about a specific location of target structure 200, to identify a component at a specific location of the target structure, or for other purposes. To illustrate, an operator attempting to locate a component of target structure 200 may enter information identifying the component into the handheld device 234 or into a computing device (not shown), which downloads the information to the handheld device 234. A position of the component can be determined based on the electronic representations of the target structure 200. The processor 214 can determine the position of the component in relation to the target structure 200, in relation to the location of the portable device 234 (for example, within the calibration frame 230), or in relation to another location that can be readily identified by the operator. [00036] In a specific embodiment, the portable device 234 includes an operator interface to provide information to the operator regarding the location of the component. For example, the operator interface can include a display screen. The display screen can display information for the operator regarding a direction, a distance, or the direction and distance to the component. In addition, the display screen can provide other information to assist the operator in locating the component. For example, the operator interface can display a text description of the component or of the location of the component, or it can display a photograph or diagram that illustrates the component, etc. The operator interface can include other output devices in addition to or instead of a display screen.
For example, the operator interface may include one or more indicators (for example, lights, moving arrows, etc.) adapted to indicate the direction or distance to the component. In another example, the operator interface may include a pointer, such as a visible laser, that points in the direction of the component. In yet another example, the operator interface can include one or more audible or haptic output devices to indicate the direction or distance to the component. To illustrate, a vibrating element or a chime may be triggered when the operator brings the handheld device 234 within a predetermined distance of the component. The predetermined distance can be selected based on how easily the operator would be expected to identify the component. For example, the predetermined distance may be shorter (that is, closer to the component) for smaller components than for larger components. In another example, the predetermined distance may be shorter when the component would normally be obscured from view by another component (for example, behind a panel). [00037] To illustrate another operational example, an operator can use the handheld device 234 to gather or store position data. For example, when the operator is performing an inspection, the handheld device 234 can be used to ensure that the inspected locations are correct or to store the inspection findings with the correct location information. To illustrate, handheld device 234 can be carried by the operator during a non-destructive test. The results of the non-destructive tests can be stored with the output of the handheld device 234 to ensure that the appropriate locations have been tested and to subsequently allow the positions where the tests were executed to be found. In another example, handheld device 234 can include or be included within a handheld test device or a camera that includes a sensor used to test or inspect target structure 200.
When the operator identifies a concern, handheld device 234 can be used to store the location and the detected data. For example, the operator can capture a digital image of an area of concern, and the position of the area of concern can be stored with the digital image. [00038] To illustrate another operational example, the handheld device 234 can be used by the operator to assist with the identification of a component at a specific location on the target structure 200. For example, the handheld device 234 may include an input that the operator can use to select a specific position. Processor 214 can determine components of target structure 200 that are close to the specific position. The input can include a trigger, a button, or another input device. In a specific embodiment, the portable device 234 includes a pointing device operable by the operator to point at the specific component. For example, handheld device 234 may include a laser device that generates a visible beam of light. The operator can point the beam at the component and select the input to generate a query. The query may request information regarding the component, such as a component identification, a component purpose, maintenance or assembly data for the component (for example, torque specifications), or any combination thereof. [00039] Figure 3 is a flow chart of a first specific embodiment of a method to determine a spatial relationship of a portable device in relation to a target structure. The method includes, at 302, initializing the handheld device. For example, the portable device may include a device capable of detecting relative motion, such as the handheld device 120 of figure 1 or the handheld device 234 of figure 2. Initialization of the handheld device may include determining a location of the handheld device in relation to the target structure.
For example, the handheld device can be placed within a calibration table, such as calibration table 150 of figure 1 or calibration frame 230 of figure 2. The target structure can include a repositionable structure or a mobile platform, such as an airplane, spaceship, vessel, land vehicle, or another structure, such as a building. To illustrate, while the handheld device is within the calibration frame, the motion data in a memory of the handheld device can be reset to zero, indicating that the handheld device is at a zero or home location. In another illustrative example, another value that corresponds to the position and orientation of the calibration table can be stored in the memory of the portable device. Thus, an initial position and orientation of the portable device can be stored in the portable device's memory. [00040] The method also includes, at 304, detecting the movement of the portable device within or on the target structure. For example, the portable device may include inertial sensors or other motion detection sensors capable of detecting the movement of the portable device within or on the target structure. The method also includes, at 306, determining the movement data based on the detected movement. For example, where the handheld device includes inertial sensors, the motion detected by the inertial sensors can be provided to a processor of the handheld device to determine a direction, distance, or other information that indicates the movement of the handheld device. The motion data can be stored in the memory of the portable device or transmitted to a remote device for storage. [00041] The method also includes, at 308, determining a spatial relationship of the portable device in relation to the target structure based at least partially on the position data and the movement data. For example, when the motion data indicates a movement of 100 feet (30.48 meters) from the starting position in a specific direction, the motion data and the starting position data can be used to determine a position of the handheld device relative to the target structure. [00042] Figure 4 is a flow chart of a second specific embodiment of a method for determining a spatial relationship of a portable device to a target structure. The method includes, at 402, moving a repositionable target structure (such as an airplane, spaceship, vessel, or ground vehicle) into position. The method also includes, at 404, determining the position and orientation of the target structure within a local coordinate system. For example, when the target structure is an airplane, the position and orientation may include a location of the airplane within the local coordinate system as well as an angle, direction, elevation, and so on for the airplane within the local coordinate system. [00043] The method may also include, at 406, accessing a memory that stores target component data. The target component data can specify the locations of components of the target structure in the local coordinate system. For example, the target component data can be determined based on one or more electronic representations of the target structure. To illustrate, the target component data can be determined based on a plurality of design or as-built drawings of the target structure, a plurality of three-dimensional computer design or as-built models of the target structure, or other electronic representations that include information about the positions of the specific components of the target structure. [00044] The method may also include, at 408, initializing a portable device.
The handheld device can be adapted to be carried by a user during interactions with the target structure and can determine the position and orientation of the handheld device in the local coordinate system based on the movements of the handheld device. For example, the portable device may include the portable device 120 of figure 1 or the portable device 234 of figure 2. In a specific embodiment, initializing the portable device includes, at 410, storing position and orientation data in a memory of the portable device. The position and orientation data can indicate a position and orientation of the handheld device within the local coordinate system. For example, the position and orientation data can indicate an initial or starting position and orientation of the handheld device (such as within a calibration table) when the handheld device is initialized. The handheld device initialization may also include, at 412, receiving target component data that specify a location in the local coordinate system of at least one target component of the target structure. A target component can refer to a component of the target structure that an operator is trying to locate. The target component data that indicate the location of the at least one target component can be stored in the memory of the handheld device. [00045] The method may also include, at 414, detecting the movement of the portable device within the local coordinate system. For example, the handheld device can be moved within the target structure. Motion can be detected using inertial sensors (for example, microelectromechanical inertial sensors) or other motion detection devices built into the portable device. In a specific embodiment, the local positioning system determines the position and orientation of the target structure within the local coordinate system using a line-of-sight positioning system, such as a laser positioning system.
Consequently, positions within the target structure may not be readily determined because they may not be within a direct line of sight to the local positioning system. In such embodiments, the handheld device may allow a user to determine the position and orientation of the handheld device or of a target component within the target structure without a line of sight to the local positioning system. [00046] The method can also include, at 416, determining the movement data based on the detected movement. For example, the movement data can indicate a direction, distance, or time of movement of the handheld device based on the detected motion. The method may also include, at 418, determining a spatial relationship of the handheld device to the target structure based at least partially on the position and orientation data determined at handheld device initialization and based on the movement data determined based on the detected movement. For example, determining the spatial relationship may include, at 420, determining a distance to, or a position of, the handheld device in relation to one or more components of the target structure, or to a location of interest defined in the coordinate system of the target structure. To illustrate, when a target component is identified and stored in the portable device's memory, a distance from the portable device to the target component can be determined. In other uses, where no target components have been identified, the handheld device can determine a location in the target structure where the handheld device is located based on the initial position and orientation data and the movement data. The method can include one or more additional actions depending on a specific task to be performed by the handheld device.
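The distance determination at 420 can be sketched directly: accumulate the displacement vectors reported since initialization onto the stored initial position, then take the straight-line distance to the target component's coordinates from the target component data. Function names and the displacement-vector format are illustrative assumptions, not the patent's interfaces.

```python
import math

def current_position(initial_pos, displacements):
    """Handheld position = stored initial (calibration) position plus the
    displacement vectors accumulated from the motion sensors, all in the
    target structure's local coordinate system. Illustrative sketch."""
    x, y, z = initial_pos
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

def distance_to_component(handheld_pos, component_pos):
    """Straight-line distance from the handheld to a target component."""
    return math.dist(handheld_pos, component_pos)
```

The same distance value can then drive the threshold-based indicators described below in paragraphs [00050] and [00051].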
For example, when the handheld device is being used to identify a specific component of the target structure, the method may include, at 422, receiving an input on the handheld device that selects a specific position (for example, a location of interest). To illustrate, while a user is within the target structure, the user can select a trigger, button, or other input device to have the handheld device store the position and orientation data associated with the handheld device's current position and orientation. The specific position of the handheld device can be used to search the target component data to identify components in a specified spatial relationship to the specific position, at 424. For example, when the user is trying to determine the identity of a specific component while within the target structure, the user can place the portable device close to or in contact with the specific component in question and can store the specific position of the portable device. The target component data can be searched based on the position of the handheld device to identify the specific component or to identify a set of components close to the specific position. [00047] In another example, the method may include, at 426, capturing data at the position of the portable device. The captured data may include, for example, data captured by a sensor integral with, coupled to, or associated with the handheld device. For example, the handheld device may include a sensor, such as a digital image sensor, a thermal sensor, another non-destructive test sensor, or any combination thereof. The sensor can capture the data at the location of the handheld device. In another example, a user who carries the handheld device can carry a sensor and a separate detection device capable of capturing the data. The captured data can be stored, at 428, in association with data describing the position of the portable device.
For example, the user can provide an input for the handheld device to store a position of the handheld device as the data is gathered. In another example, the handheld device can automatically store the position when the captured data is captured or stored. The captured data, the position data, or both can be sent to a remote device for analysis, at 430. [00048] To illustrate a specific use, the portable device can be used by maintenance or assembly personnel to gather non-destructive test data associated with the target structure. The non-destructive test data can be gathered by the handheld device or by another device. The non-destructive test data can be stored with the position data determined by the handheld device so that the location where the non-destructive test data was gathered is correctly identified. In another example, during an inspection, when an area of concern is identified, a digital image or other data associated with the area of concern can be captured and stored in association with data describing the position of the area of concern, and the data can be sent to a remote device for analysis. [00049] In yet another example, the method may include, at 432, determining an identification of a specific component of the target structure that is close to the handheld device based on the spatial relationship of the handheld device to the target structure and the target component data. For example, when a user of the handheld device is within the target structure and wants to identify a specific component, the user can provide an input for the portable device that indicates searching for components within a predetermined spatial relation of the portable device. The search can include a search of a memory of the portable device or a search of a remote database that includes the target component data.
For example, the position of the handheld device can be transmitted wirelessly to a remote device that implements the search of the remote database. Based on the location of the handheld device, the descriptive data of the target structure, and the target component data, the identification of the specific component can be determined. [00050] In yet another example, the method may include, at 434, selecting one or more threshold distances based on an identification of a target component and, at 436, activating one or more indicators when one of the one or more threshold distances is satisfied. For example, when the target component is identified during initialization or during use of the handheld device, the one or more threshold distances associated with the target component can be identified. A specific threshold distance can be determined based on, for example, a location of the target component, a size of the target component, or other information that generally indicates how difficult or easy the target component is expected to be to find. [00051] In a specific embodiment, a first threshold distance is used to activate a first indicator and a second threshold distance is used to activate a second indicator. For example, the first threshold distance can be satisfied when the handheld device is more than a specific distance from the target component. The first indicator can include a pointer or other indicator associated with the handheld device. The first indicator can be activated to indicate that the handheld device is more than the first threshold distance from the target component. The first indicator can provide an indication of a direction to the target component, a distance to the target component, other information useful for locating the target component (such as a direction to follow toward a readily identifiable area of the target structure, for example, a cargo area), or any combination thereof.
The second threshold distance can indicate that the handheld device is within a second distance of the target component, and the second indicator can be activated when the second threshold distance is satisfied. The second threshold distance may be closer to the target component than the first threshold distance. To illustrate, during use, the first threshold distance may indicate that the handheld device is in or near a compartment or other enclosed area that includes the target component. The second threshold distance may indicate that the handheld device is within a predetermined distance, such as arm's length, a specified number of feet or meters, or another distance that is relatively close to the target component. [00052] Figure 5 is a flow chart of a third specific embodiment of a method for determining a spatial relationship of a portable device in relation to a target structure. The method includes, at 502, calibrating a local positioning system (LPS). For example, the LPS may include one or more position detectors, such as local position detectors 104 of figure 1 or local position detectors 202 of figure 2. The position detectors can interact with one or more targets to determine target locations. In a specific embodiment, the LPS is calibrated by having at least one position detector measure calibration targets at known locations. The calibration targets may be at known locations defined in a coordinate system associated with a location (for example, an assembly or maintenance area) or at known locations defined in a coordinate system associated with a target structure (for example, an airplane). [00053] In a specific embodiment, the portable device includes an inertial measurement unit (IMU) to detect the movement of the portable device and to measure positions relative to the target structure. The IMU can operate in one of two or more operating modes, such as an absolute mode 506 and a relative mode 530.
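The two-stage indicator logic of paragraphs [00050] and [00051] can be sketched as a simple threshold check on the computed distance; the indicator strings and the assumption that the two thresholds partition the range are illustrative, not the patent's exact behavior.

```python
def select_indicator(distance, first_threshold, second_threshold):
    """Choose which indicator to activate for the current distance to the
    target component: the first (guidance) indicator while still far away,
    the second (proximity) indicator once within close range, and none in
    between. Assumes second_threshold < first_threshold. Hypothetical."""
    if distance > first_threshold:
        return "first: show direction and distance to the target component"
    if distance <= second_threshold:
        return "second: vibrate/chime, target within reach"
    return None
```

In a real device this check would run each time the dead-reckoned position is updated, so the indicators switch as the operator moves through the structure.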
The method may include determining the operating mode at 504. [00054] Absolute mode 506 can be used for point measurements, such as to determine a position of a target location that is out of the line of sight of the LPS. For example, when the IMU is used in absolute mode 506, the method may include placing the IMU on a carrier, at 510. The carrier can be a portable or mobile calibration table, such as calibration table 150 of figure 1 or calibration frame 230 of figure 2. The carrier may include one or more LPS targets, such as target locations 152 of figure 1, which allow the LPS to determine the position and orientation of the carrier. The IMU and the carrier can be moved to a visible location (i.e., a location with a line of sight to a positioning device of the LPS, such as a laser measurement device). A more accurate measurement can be obtained if the visible location is relatively close to the target location. [00055] The method may also include determining the location of the IMU and the carrier (for example, the position and orientation of the IMU and the carrier) at the visible location using the LPS to take measurements of the location, at 512. A memory of the IMU can be modified at the visible location. For example, the acceleration and velocity data stored by the inertial measurement components of the IMU can be reset to zero, and the IMU can start capturing motion data, at 514. Locations that are not visible to the local positioning system can also be used if their positions and orientations in local coordinates are known. [00056] The method may include capturing motion data as the IMU is moved to the target location, at 516. When the IMU is at the target location, the memory of the IMU can again be modified to store the target location information. For example, the acceleration and velocity data stored in the inertial measurement component of the IMU can be reset again.
A user can provide an input on the IMU to indicate that the target location has been reached. To illustrate, the user can select a button or other input device to indicate that the target location has been reached. The motion data captured while moving the IMU to the target location can be transmitted to a remote device, such as a component of the LPS. In one embodiment, a first memory stores position data relating to moving the IMU from the carrier to the target location, and a second memory stores position data relating to moving the IMU from the target location back to the carrier. Alternatively, the position data relating to moving the IMU from the carrier to the target location can be reset at the target location and replaced with position data relating to moving the IMU from the target location back to the carrier. [00057] The IMU can be returned to a known location (for example, a known position and a known orientation), such as the carrier, at 518. The position of the target location can be calculated, at 520, based on the position data stored by the IMU. The calculation process depends on the type of sensors used in the portable device. If the handheld device uses an IMU equipped with accelerometers and gyroscopes, the process may involve double integration of the acceleration data to produce relative position offsets and integration of the rotational rate data to produce angular offsets. The calculation can include a drift compensation adjustment that uses the known position and orientation data, along with the known zero acceleration and velocity conditions at the beginning and end of the movement sequence, to correct sensor drift during movement. If more target points are to be measured, at 522, the method can reset the IMU memory again, at 514. When no more target points are to be measured, at 522, the method ends at 524.
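The double-integration and drift-compensation step at 520 can be sketched for a single axis: integrate the acceleration samples twice, and use the known zero-velocity condition at the end of the movement to remove linearly accumulated sensor drift. This is a simplified, hypothetical zero-velocity-update scheme, not the patent's exact computation.

```python
def displacement_with_drift_correction(accel, dt):
    """Single-axis sketch: double-integrate acceleration samples into a
    displacement, correcting for sensor drift by exploiting the known
    zero-velocity condition at the start and end of the movement."""
    # First integration: acceleration -> velocity (trapezoidal rule).
    v = [0.0]
    for i in range(1, len(accel)):
        v.append(v[-1] + 0.5 * (accel[i - 1] + accel[i]) * dt)
    # The device is known to be at rest at the end of the movement, so any
    # residual final velocity is drift; remove it linearly over the samples.
    n = len(v) - 1
    if n > 0:
        drift = v[-1]
        v = [vi - drift * i / n for i, vi in enumerate(v)]
    # Second integration: velocity -> displacement (trapezoidal rule).
    x = 0.0
    for i in range(1, len(v)):
        x += 0.5 * (v[i - 1] + v[i]) * dt
    return x
```

A constant accelerometer bias produces a linearly growing velocity error, which this correction cancels exactly; more elaborate schemes handle time-varying drift and all three axes plus rotation.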
[00058] The relative mode 530 can be used as an interactive tracking mode to cause a measurement device of the LPS to point in the direction of the IMU or in the direction of another location based on the movement of the IMU. For example, relative mode 530 may include moving the IMU to a start location, at 532. Position and orientation data can be modified at the start location. For example, acceleration and velocity data can be reset to zero and an IMU transformation matrix can be set to identity, at 534. [00059] The IMU can be moved by a user while the interactive tracking process is active in the LPS. For example, the user can activate the interactive tracking process of the LPS by providing an input on the IMU (for example, by selecting a button on the IMU). The interactive tracking process can cause the positioning device of the LPS to move to follow the movements of the IMU. The interactive tracking process can include capturing motion data on the IMU and using the motion data to compute the location of the IMU (for example, position and orientation) in relation to the start location, at 536. The method can also include sending the position offset data to a remote device, such as an LPS component, at 538. For example, the position offset data can be sent via wireless transmission. The remote device can use inverse kinematics to calculate the device angles to aim the positioning device that is tracking the IMU. The method may also include sending commands to a positioning system of the local positioning device to cause the positioning device to point based on the movement of the IMU. For example, the remote device can drive a pan-tilt unit to new angles based on the calculated device angles, at 540.
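The inverse-kinematics step at 538 and 540 has a closed-form sketch for an idealized two-axis pan-tilt unit whose rotation axes intersect at the instrument origin; the coordinate conventions (z up, pan measured from the x-axis) are assumptions, not the patent's.

```python
import math

def pan_tilt_angles(instrument_pos, imu_pos):
    """Pan (azimuth) and tilt (elevation) angles, in degrees, that aim an
    idealized two-axis pan-tilt unit at the IMU's computed position.
    Illustrative sketch; real units need offset and backlash models."""
    dx = imu_pos[0] - instrument_pos[0]
    dy = imu_pos[1] - instrument_pos[1]
    dz = imu_pos[2] - instrument_pos[2]
    pan = math.degrees(math.atan2(dy, dx))                   # about the vertical axis
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # above the horizontal
    return pan, tilt
```

Recomputing these angles each time a new IMU position offset arrives makes the positioning device follow the IMU, which is the behavior described for the interactive tracking process.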
[00060] When the method is to continue, at 542 (for example, when the user continues to press a button on the IMU), the method can return to 536 to capture additional motion data and compute the position of the IMU in relation to the previous location. When no additional movements are to be captured, the method ends at 524. [00061] Consequently, the absolute mode 506 can allow the measurement of target locations that are not in line of sight of the LPS. The relative mode 530 can allow tracking of IMU movements by the LPS. Absolute mode 506, relative mode 530, or both can be used in conjunction with other methods described herein. For example, relative mode 530 can be used to designate a specific component to be measured or to be identified by having the LPS point to the specific component. In another example, absolute mode 506 can be used to determine position data to be stored in conjunction with other captured data, such as non-destructive test data. [00062] The illustrations of the embodiments described here are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and characteristics of apparatus and systems using the structures or methods described herein. Many other embodiments may be apparent to those skilled in the art upon reviewing the description. Other embodiments can be used and derived from the description, such that structural and logical substitutions and changes can be made without departing from the scope of the description. For example, method steps can be performed in a different order than shown in the figures, or one or more method steps can be omitted. Consequently, the description and figures should be considered as illustrative rather than restrictive.
[00063] Furthermore, although specific embodiments have been illustrated and described here, it should be appreciated that any subsequent arrangement designed to achieve the same or similar results may be substituted for the specific embodiments shown. The description is intended to cover any and all subsequent adaptations or variations of the various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those skilled in the art upon examining the description.

[00064] The Abstract of the Description is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the Detailed Description above, several features may be grouped together or described in a single embodiment for the purpose of streamlining the description. This description should not be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, the claimed subject matter may be directed to fewer than all of the features of any of the described embodiments.
Claims (15)

1. Method, which comprises:
initializing (302; 408) a handheld device (120) by storing position and orientation data in a memory of the handheld device (120), the position and orientation data indicating a position and orientation of the handheld device (120) within a local coordinate system associated with a target structure (200);
detecting (304; 414) movement of the handheld device (120);
determining (306; 416) movement data based on the detected movement; and
determining (308; 418) a spatial relationship of the handheld device (120) with respect to the target structure (200) based at least partially on the position and orientation data and on the movement data;
characterized by the fact that it comprises activating (436) a first indicator on the handheld device (120) when the spatial relationship is within a first limit distance of a target component.

2. Method according to claim 1, characterized by the fact that initializing the handheld device (120) includes receiving (112) target component data that specifies a location, in the local coordinate system, of at least one target component of the target structure (200).

3. Method according to claim 1 or 2, characterized by the fact that it further comprises accessing (406) a memory that stores target component data specifying the locations of components of the target structure (200) in the local coordinate system, wherein the target component data specifies the locations of the components of the target structure (200) based on one or more electronic representations of the target structure (200), and wherein, preferably, determining (418) the spatial relationship of the handheld device (120) with respect to the target structure (200) includes determining (420) a position and orientation of the handheld device (120) with respect to one or more of the components of the target structure (200).

4.
Method according to any one of the preceding claims, characterized by the fact that it comprises selecting (434) the first limit distance based on an identification of the target component.

5. Method according to any one of the preceding claims, characterized by the fact that it further comprises activating a second indicator on the handheld device when the spatial relationship is within a second limit distance of the target component, wherein:
the first limit distance is different from the second limit distance;
the first indicator is distinct from the second indicator; and
at least one of the first indicator and the second indicator indicates a direction or a distance to the target component.

6. Method according to any one of the preceding claims, characterized by the fact that it further comprises:
receiving (422) an input at the handheld device that selects a specific position of the handheld device (120); and
searching (424) the component data to identify one or more components of the target structure (200) that have a spatial relationship to the specific position.

7. Portable device (120), comprising:
a calibration component (126) for communicating with a local positioning system (102) to determine an initial position and orientation of the portable device (120) within a local coordinate system associated with a target structure (200);
at least one motion sensor (128) for detecting movement of the portable device (120); and
a processor (122) for determining a spatial relationship of the portable device (120) to the target structure (200) based on the initial position and orientation of the portable device (120) within the local coordinate system and on the detected movement of the portable device (120),
characterized by the fact that it further comprises a first indicator to be activated when the spatial relationship is within a first limit distance of a target component.

8.
Portable device according to claim 7, characterized by the fact that it further comprises a memory (124) accessible to the processor (122), the memory (124) storing target component data (140) that specifies the locations of components of the target structure (200) in the local coordinate system;
wherein the processor (122) further determines a spatial relationship of the portable device (120) to at least one of the components based on the measured position and on the target component data (140).

9. Portable device according to claim 7 or 8, characterized in that it further comprises an input device (132) for receiving an input that specifies a target component among the components of the target structure (200);
wherein the processor (122) determines information about a spatial relationship between the portable device (120) and the target component based on the measured position and on the target component data.

10. Portable device according to claim 8 or 9, characterized by the fact that it further comprises a pointer (134) adapted to point to a location of the target component.

11. Portable device according to any one of claims 8 to 10, characterized in that the processor (122) determines the measured position and orientation of the portable device (120) when the portable device is not in line of sight of the local positioning system (102).

12. Portable device according to any one of claims 8 to 11, characterized in that it further comprises a sensor (136) for capturing data at the measured position, wherein the processor (122) stores the captured data (144) with data descriptive of the measured position, and wherein the portable device preferably further comprises an interface (130) for sending the data descriptive of the measured position and the captured data to a remote device.

13.
Positioning system (102), comprising:
a plurality of local position detectors (104) for determining a position and an orientation (152) of a location of interest on a target structure (200) within a first coordinate system;
a system processor (106) for determining a second coordinate system of the target structure (200) based on the position and orientation of the target structure in the first coordinate system and on at least a portion of one or more electronic representations of the target structure; and
a portable device (120) as defined in any one of claims 7 to 12, characterized by the fact that it includes:
a memory (124) for storing the initial position and orientation of the portable device (120) within the second coordinate system;
the processor (122) determining a measured position and orientation of the portable device (120) within the target structure (200) based on the initial position and orientation of the portable device (120) within the second coordinate system and on the detected movement of the portable device (120).

14. System according to claim 13, characterized in that the local position detectors (104) comprise laser devices at known locations within the first coordinate system, the laser devices adapted to make measurements of a position and an orientation of the target structure (200) within the first coordinate system.

15. System according to claim 13 or 14, characterized by the fact that it further comprises a calibration table (150) that includes a plurality of target locations (152), wherein measurement data from the laser devices are used to determine the initial position and orientation of the portable device (120) based on the position of the calibration table (150).
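The threshold behavior recited in claims 1, 4, and 5 (a first indicator at a first limit distance, a distinct second indicator at a different limit distance) can be sketched as follows. This is only an illustration of the claimed logic; the indicator names and the mapping to LEDs, tones, or display cues are assumptions, not part of the claims.

```python
def select_indicators(distance, first_limit, second_limit):
    """Return which indicators of the handheld device to activate, given
    the current distance to the target component and two distinct limit
    distances (first_limit > second_limit in this sketch, so the second
    indicator gives a finer proximity cue)."""
    active = []
    if distance <= first_limit:
        active.append("first_indicator")   # coarse proximity cue
    if distance <= second_limit:
        active.append("second_indicator")  # fine proximity cue
    return active
```

As the computed spatial relationship updates with each motion sample, such a function would be re-evaluated to turn the indicators on or off.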
Family patents:
EP2513602B1 | 2015-01-07
CN102656425B | 2016-08-03
WO2011075251A1 | 2011-06-23
US20110149266A1 | 2011-06-23
CA2778261C | 2015-02-17
US8279412B2 | 2012-10-02
CA2778261A1 | 2011-06-23
CN102656425A | 2012-09-05
EP2513602A1 | 2012-10-24
Legal status:
2019-01-08 | B06F | Objections, documents and/or translations needed after an examination request, according to art. 34 of the Industrial Property Law
2019-08-20 | B06U | Preliminary requirement: requests with searches performed by other patent offices; suspension of the patent application procedure
2019-08-27 | B06U | Preliminary requirement: requests with searches performed by other patent offices; suspension of the patent application procedure
2019-10-29 | B06I | Technical and formal requirements: publication cancelled (the code 6.21 publication in RPI No. 2538 of 27/08/2019 was annulled because it had been unduly duplicated)
2020-02-04 | B09A | Decision: intention to grant
2020-04-07 | B16A | Patent or certificate of addition of invention granted (term of validity: 20 (twenty) years counted from 12/11/2010, subject to the legal conditions)
Priority:
US 12/640,211 (published as US8279412B2), filed 2009-12-17 | Position and orientation determination using movement data
PCT/US2010/056564 (published as WO2011075251A1), filed 2010-11-12 | Position and orientation determination using movement data