Patent abstract:
METHODS FOR BUILDING A LOCAL MAP FEATURE, DETERMINING A VEHICLE POSITION, AND UPDATING GLOBAL MAP DATA

A method and apparatus for sharing map data between industrial vehicles in a physical environment is described. In one embodiment, the method includes processing local map data associated with a plurality of industrial vehicles, wherein the local map data comprises feature information generated by the plurality of industrial vehicles regarding features observed by the industrial vehicles in the plurality of vehicles; combining the feature information associated with the local map data to generate global map data for the physical environment; and navigating an industrial vehicle of the plurality of industrial vehicles using at least a portion of the global map data.
Publication number: BR112013030722A2
Application number: R112013030722-6
Filing date: 2012-06-08
Publication date: 2020-08-04
Inventors: Lisa Wong; Christopher W. Goode; Andrew Evan Graham
Applicant: Crown Equipment Limited
IPC main class:
Patent description:

METHODS FOR BUILDING A LOCAL MAP FEATURE, DETERMINING A VEHICLE POSITION, AND UPDATING GLOBAL MAP DATA
BACKGROUND

Technical Field

Embodiments of the present invention generally relate to environment-based navigation systems for automated industrial vehicles and, more specifically, to methods and apparatus for sharing map data associated with automated industrial vehicles.

Description of the Related Art

Entities operate numerous facilities in order to meet supply and/or demand objectives. For example, small to large corporations, government organizations and/or the like employ a variety of logistics management and inventory management paradigms to move objects (e.g., raw materials, goods, machines and/or the like) in a variety of physical environments (e.g., warehouses, cold rooms, factories, plants, stores and/or the like). A multinational company may build warehouses in one country to store raw materials for manufacturing goods, which are stored in a warehouse in another country for distribution to local retail markets. Warehouses must be well organized in order to maintain and/or improve production and sales. If raw materials are not transported to a factory at an optimal rate, fewer goods are manufactured. As a result, revenue is not generated for the unmanufactured goods to offset the cost of the raw materials.

Unfortunately, physical environments such as warehouses have several limitations that prevent the timely completion of various tasks. Warehouses and other shared-use spaces, for example, need to be safe for a human workforce. Some employees operate heavy machinery and industrial vehicles, such as forklifts, which have the potential to cause serious or fatal injury. Nonetheless, humans are required to operate industrial vehicles to complete tasks, which include object handling tasks such as moving pallets of goods to different locations within a warehouse. Most warehouses employ a large number of forklift drivers and forklifts to move objects. In order to increase productivity, these warehouses simply add more forklifts and forklift drivers.
In order to reduce the aforementioned problems, some warehouses use equipment to automate these tasks. As an example, these warehouses may employ automated industrial vehicles, such as forklifts, to carry objects along paths and then unload those objects at designated locations. When navigating an industrial vehicle, the uncertainties and noise associated with the sensor measurements must be taken into account. Because the sensors attached to the vehicle are limited to a specific field of view or range, the vehicle cannot extract and process data associated with features and landmarks that cannot be observed.
Therefore, there is a need in the art for a method and apparatus for sharing map data between automated industrial vehicles.
SUMMARY

Various embodiments of the present disclosure generally include a method and apparatus comprising processing local map data associated with a plurality of industrial vehicles, wherein the local map data comprise feature information generated by the plurality of industrial vehicles regarding features observed by the industrial vehicles in the plurality of vehicles; combining the feature information associated with the local map data to generate global map data for the physical environment; and navigating an industrial vehicle of the plurality of industrial vehicles using at least a portion of the global map data.
BRIEF DESCRIPTION OF THE DRAWINGS

So that the manner in which the above-recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the attached drawings. It is to be noted, however, that the attached drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit other equally effective embodiments.

Figure 1 is a perspective view of a physical environment comprising various embodiments of the present disclosure;

Figure 2 illustrates a perspective view of a forklift for navigating a physical environment to perform various tasks according to one or more embodiments;

Figure 3 is a structural block diagram of a system for providing map creation, updating and sharing for an industrial vehicle according to one or more embodiments;

Figure 4 is a functional block diagram of a system for providing map creation, updating and sharing for an industrial vehicle according to one or more embodiments;

Figure 5 is an interaction diagram illustrating a mapping and localization process for an industrial vehicle according to one or more embodiments;
Figure 6 is a flow diagram of a method of operation of a local map module associated with industrial vehicles according to one or more embodiments; and

Figure 7 is a flow diagram of a method of operation of a global map module according to one or more embodiments.
DETAILED DESCRIPTION

Figure 1 illustrates a schematic perspective view of a physical environment 100 comprising one or more embodiments of the present invention.

In some embodiments, the physical environment 100 includes a vehicle 102 that is coupled to a mobile computer 104, a central computer 106, as well as a sensor array 108. The sensor array 108 includes a plurality of devices for analyzing various objects within the physical environment 100 and transmitting data (e.g., image data, video data, map data, three-dimensional graph data and/or the like) to the mobile computer 104 and/or the central computer 106, as explained further below. The sensor array 108 includes various types of sensors, such as encoders, ultrasonic range finders, laser range finders, pressure transducers and/or the like.

The physical environment 100 further includes a floor 110 supporting a plurality of objects. The plurality of objects includes a plurality of pallets 112, a plurality of units 114 and/or the like, as explained further below. The physical environment 100 also includes various obstacles (not shown) to the proper operation of the vehicle 102. Some of the plurality of objects may constitute obstructions along various paths (e.g., pre-programmed or dynamically computed routes) if such objects disrupt task completion. For example, an obstacle includes a broken pallet at a
target destination associated with a transported object load.

The physical environment 100 also includes a plurality of markers 116. The plurality of markers 116 is illustrated as objects attached to the ceiling. In some embodiments, the markers 116 may be located on the floor or on a combination of the floor and ceiling. In some embodiments, the plurality of markers 116 are beacons that facilitate environment-based navigation, as explained further below. The plurality of markers 116, as well as other objects around the physical environment 100, form reference points defined by environmental features. The mobile computer 104 extracts the environmental features and determines an accurate, current vehicle position.

The physical environment 100 may include a warehouse or cold store for storing the plurality of units 114 in preparation for future transport. Warehouses may include loading docks for loading and unloading the plurality of units from commercial vehicles, railways, airports and/or seaports. The plurality of units 114 generally includes various goods, products, raw materials and/or the like. For example, the plurality of units 114 may be consumer goods that are placed on ISO-standard pallets and loaded onto pallet racks by forklifts to be distributed to retail stores. The vehicle 102 facilitates such distribution by moving the consumer goods to designated locations where commercial vehicles (e.g., trucks) load and subsequently deliver the consumer goods to one or more target destinations.

According to one or more embodiments, the vehicle 102 may be an automated guided vehicle (AGV), such as an automated forklift, which is configured to handle and/or move a plurality of units 114 on the floor 110. The vehicle 102 uses one or more lifting elements, such as forks, to lift one or more units 114, and then transports those units 114 along a path within a transit area 120 (e.g., an aisle) to be placed in a free space area 122. Alternatively, the one or more units 114 may be arranged on a pallet 112, which the vehicle 102 lifts and moves to the destination location. Each of the plurality of pallets 112 is a flat transport structure that supports goods in a stable manner while being lifted by the vehicle 102 and/or another lifting device (e.g., a pallet jack and/or a front loader). The pallet 112 is the structural foundation of an object load and enables storage and handling efficiencies. Various pallets of the plurality of pallets 112 may be used within a racking system (not shown). Within a given racking system, rollers or gravity rails allow one or more units 114 on one or more pallets 112 to flow forward. The one or more pallets 112 moving forward may be slowed or stopped by a retarding device, a physical stop or another pallet 112.

In some embodiments, the mobile computer 104 and the central computer 106 are communication devices that control the vehicle 102 and perform various tasks within the physical environment 100. The mobile computer 104 is adapted to couple with the vehicle 102, as illustrated. The mobile computer 104 may also receive and aggregate data (e.g., laser scanner data, image data and/or any other related sensor data) that is transmitted by the sensor array 108. Various software modules within the mobile computer 104 control the operation of hardware components associated with the vehicle 102, as explained further below.
Embodiments of the invention use and update map information that is used by the industrial vehicle 102 to navigate through the environment 100 and perform tasks. The map information comprises a local map maintained by the mobile computer 104 and a global map maintained by the central computer 106. The local map defines features in the environment which are at or near a particular vehicle 102, or comprises features of an area of the environment 100 in which the vehicle is operating or is about to operate, while the global map defines the entire environment 100. As a vehicle performs tasks, the local map is updated by the mobile computer 104, and the data added to the local map is also used to update the global map, such that the local map information updated by one vehicle is shared with other vehicles 102. The map use and update processes are described in detail below.

Figure 2 illustrates a perspective view of the forklift 200 for facilitating automation of various tasks within a physical environment according to one or more embodiments of the present invention.
The forklift 200 (i.e., a lift truck, a high/low, a stacker truck, a trailer loader, a side loader or a fork hoist) is a powered industrial truck having various load capacities and used to lift and transport various objects. In some embodiments, the forklift 200 is configured to move one or more pallets (e.g., the pallets 112 of Figure 1) of units (e.g., the units 114 of Figure 1) along paths within the physical environment (e.g., the physical environment 100 of Figure 1). The paths may be predefined or dynamically computed as tasks are received. The forklift 200 may travel within a storage bay that is multiple pallet positions deep in order to place or retrieve a pallet. Often, the forklift 200 is guided into the storage bay and places the pallet on cantilevered arms or rails. Hence, the dimensions of the forklift 200, including overall width and mast width, must be accurate when determining an orientation associated with a target object and/or destination.

The forklift 200 typically includes two or more forks (i.e., skids or tines) for lifting and carrying units within the physical environment. Alternatively, instead of two or more forks, the forklift 200 may include one or more metal poles (not shown) in order to lift certain units (e.g., carpet rolls, metal coils and/or the like). In one embodiment, the forklift 200 includes hydraulically powered telescopic forks that allow two or more pallets to be placed behind each other without an aisle between those pallets.

The forklift 200 may further include various mechanical, hydraulic and/or electrically operated actuators according to one or more embodiments. In some embodiments, the forklift 200 includes one or more hydraulic actuators (not labeled) that permit lateral and/or rotational movement of the two or more forks. In one embodiment, the forklift 200 includes a hydraulic actuator (not labeled) for moving the forks together and apart. In another embodiment, the forklift 200 includes a mechanical or hydraulic component for clamping a unit (e.g., barrels, rolls of paper and/or the like) to be transported.

The forklift 200 may be coupled with the mobile computer 104, which includes software modules for operating the forklift 200 in accordance with one or more tasks. The forklift 200 is also coupled with an array comprising various sensor devices (e.g., the sensor array 108 of Figure 1), which transmit sensor data (e.g., image data, video data, planar range data and/or three-dimensional graph data) to the mobile computer 104 for extracting information associated with environmental features. These devices may be mounted on the forklift 200 at any exterior and/or interior position or mounted at known locations around the physical environment 100. Exemplary embodiments of the forklift 200 typically include a camera 202, a planar laser scanner 204 attached to each side and/or an encoder 206 attached to each wheel 208. In other embodiments, the forklift 200 includes only the planar laser scanner 204 and the encoder 206. The forklift 200 may use any sensor array with a field of view that extends along a current direction of motion (e.g., travel forwards or backwards, fork movement up or down, reach out or in, and/or the like). These encoders determine motion data related to vehicle movement. Externally mounted sensors may include laser scanners or cameras positioned where the rich data set available from such sensors would enhance automated operations. External sensors may include a limited set of transponders and/or other active or passive means by which an automated vehicle can obtain an approximate position to seed a localization function.

In some embodiments, the number of sensor devices (e.g., laser scanners, laser range finders, encoders, pressure transducers and/or the like) as well as their position on the forklift 200 are vehicle dependent, and the position at which these sensors are mounted affects the processing of the measurement data. For example, by ensuring that all laser scanners are placed at a measurable position, the sensor array 108 may process the laser scan data and transpose them to a central point on the forklift 200. Furthermore, the sensor array 108 may combine multiple laser scans into a single virtual laser scan, which may be used by various software modules to control the forklift 200.
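As a hedged illustration of this scan transposition (a minimal sketch under assumed mounting poses and helper names, not the patent's implementation), points from each planar scanner can be rotated and translated into the vehicle frame and concatenated into a single virtual scan:

```python
# Illustrative sketch only: merging planar laser scans from several mounted
# scanners into one "virtual" scan expressed in the vehicle frame. The mounting
# poses and function names are hypothetical, not taken from the patent.
import math

def to_vehicle_frame(points, mount_x, mount_y, mount_yaw):
    """Transform (x, y) points from a scanner frame into the vehicle frame."""
    c, s = math.cos(mount_yaw), math.sin(mount_yaw)
    return [(mount_x + c * x - s * y, mount_y + s * x + c * y) for x, y in points]

def merge_scans(scans):
    """scans: list of (points, (mount_x, mount_y, mount_yaw)) -> one virtual scan."""
    virtual_scan = []
    for points, (mx, my, myaw) in scans:
        virtual_scan.extend(to_vehicle_frame(points, mx, my, myaw))
    return virtual_scan
```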
Figure 3 is a structural block diagram of a system 300 for providing accurate localization for an industrial vehicle using map data shared between vehicles according to one or more embodiments. In some embodiments, the system 300 includes a plurality of mobile computers 104 (illustrated as the mobile computer 104-1 ... the mobile computer 104-n) associated with a plurality of automated vehicles, the central computer 106 and the sensor array 108, in which each component is coupled to the others through a network 302.

Each of the plurality of mobile computers 104 is a type of communication device (e.g., a laptop computer, a desktop computer, a Personal Digital Assistant (PDA) and the like) comprising a central processing unit (CPU) 304, various support circuits 306 and a memory 308. The CPU 304 may comprise one or more commercially available microprocessors or microcontrollers that facilitate data processing and storage. The various support circuits 306 facilitate operation of the CPU 304 and may include clock circuits, buses, power supplies, input/output circuits and/or the like. The memory 308 includes read-only memory, random access memory, disk drive storage, optical storage, removable storage and the like. The memory 308 includes various data, such as local map data 310, feature information 312, reference point data 314, empty space occupancy data 316, geographic position prediction data 317, geographic position measurement data 318 and a request 319. The memory 308 includes various software packages, such as an environment-based navigation module 320 and a local map module 338.

The central computer 106 is a type of communication device (e.g., a laptop computer, a desktop computer, a server, a Personal Digital Assistant (PDA) and the like) comprising a central processing unit (CPU) 322, various support circuits 324 and a memory 326. The CPU 322 may comprise one or more commercially available microprocessors or microcontrollers that facilitate data processing and storage. The various support circuits 324 facilitate operation of the CPU 322 and may include clock circuits, buses, power supplies, input/output circuits and/or the like. The memory 326 includes read-only memory, random access memory, disk drive storage, optical storage, removable storage and the like. The memory 326 includes various software packages, such as a global map module 328, as well as various data, such as a task 330, global map data 334 and a path 336.

The network 302 comprises a communication system that connects computers by wire, cable, fiber optic and/or wireless links facilitated by various types of well-known network elements, such as hubs, switches, routers and the like. The network 302 may employ various well-known protocols to communicate information among the network resources. For example, the network 302 may be part of the Internet or an intranet using various communications infrastructures such as Ethernet, WiFi, WiMax, General Packet Radio Service (GPRS) and the like.

The sensor array 108 is communicably coupled to the mobile computer 104, which is attached to an automated vehicle, such as a forklift (e.g., the forklift 200 of Figure 2). The sensor array 108 includes a plurality of devices 332 for monitoring a physical environment and capturing various data, which are stored by the mobile computer 104 as the geographic position measurement data 318. In some embodiments, the sensor array 108 may include any combination of one or more laser scanners and/or one or more cameras.
In some embodiments, the plurality of devices 332 may be mounted on the automated vehicle. For example, a laser scanner and a camera may be attached to a lift carriage at a position above the forks. Alternatively, the laser scanner and the camera may be located below the forks.

In some embodiments, for a given task 330, the central computer 106 computes a path 336 to be used by the vehicle 102 to complete the task 330. Using both the local map data 310 and the global map data 334 facilitates accurate, autonomous guidance of the vehicle 102 through the environment to complete the task 330.

In some embodiments, the local map module 338 uses and maintains the local map data 310 and transmits updates to the global map module 328 of the central computer 106. The local map data 310 comprises the feature information 312 and the reference point data 314. In one embodiment, the feature information includes dynamic and/or static features representing a physical environment near the vehicle, such as a shared-use area for human workers and automated industrial vehicles. Static features represent objects that do not change within the environment, e.g., walls, storage racks and the like. The local map data 310 may be organized to provide a vector of known reference points (landmarks) and of static and dynamic features. In some embodiments, the feature information 312 includes a feature geometry (line, corner, arc, etc.); a geographic position of the feature in the global coordinate system; and a geographic position uncertainty of the feature. Typically, the geographic position uncertainty for static features is zero.
In some embodiments, dynamic features represent objects that change within the environment, e.g., temporary obstructions such as broken pallets, objects to be stored, and the like. These features are likely to be stationary for a sufficient amount of time for the system to use them as localization map features. The system contains no prior information about the geographic position of these features, and therefore the geographic position of these dynamic features can only be inferred by overlaying the vehicle-centered measurements from the sensors on the estimated geographic position of the vehicle with respect to the global coordinate system. Because of the noise in the sensor data, as well as the uncertainty in the estimate of the vehicle's geographic position, all dynamic features have a geographic position uncertainty associated with their geographic position.
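For illustration only, the feature record described above might be sketched as follows; the field names, types and example values are assumptions of this sketch rather than anything specified in the patent:

```python
# Minimal sketch of a feature record: geometry class, position in the global
# coordinate system, and a position uncertainty (zero for static features).
from dataclasses import dataclass
from typing import Literal

@dataclass
class Feature:
    geometry: Literal["line", "corner", "arc"]  # feature geometry class
    x: float                                    # position in the global frame
    y: float
    heading: float                              # orientation, in radians
    position_uncertainty: float                 # zero for a priori static features
    dynamic: bool = False                       # True for temporary objects

# A wall segment known a priori: static, with zero position uncertainty.
wall = Feature("line", 12.0, 3.5, 0.0, position_uncertainty=0.0)
# A broken pallet inferred from on-vehicle sensing: dynamic, uncertain.
pallet_edge = Feature("line", 7.2, 9.1, 1.57, position_uncertainty=0.08, dynamic=True)
```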
In some embodiments, the local map module 338 stores reference points: physical entities within the environment (100 of Figure 1) with a known or estimated geographic position. The reference point data 314 comprises a geometry, a geographic position, and a set of specific features that represent the sensor views of the reference point; for example, a wall may be seen by a planar laser scanner as a line, a corridor as two parallel lines, and so on. Some of the plurality of reference points include walls, or a pallet or racking system in which a number of empty spaces are configured to hold objects.
In some embodiments, the local map module 338 maintains empty space information, such as a set of empty space geometries and geographic positions, as part of the reference point data 314. Empty spaces are a class of virtual reference point, since their geometry is fixed; however, the presence of a pallet in the empty space indicates whether this reference point and the associated feature information 312 are visible to the vehicle. The empty space occupancy data 316 indicates the presence of a pallet in the associated empty space and therefore whether the features associated with that pallet form part of the local map data. Empty spaces may exist on the floor, where they can be used for navigation, either above the floor as a stacking arrangement or in the racking system.

In some embodiments, the environment-based navigation module 320 includes software code (e.g., processor-executable instructions) for determining an accurate vehicle position and for updating the local map data 310 with new reference points and associated portions of the reference point feature information 312, including their geographic position and geographic position uncertainty. After the environment-based navigation module 320 processes the geographic position measurement data 318 from the plurality of sensor devices 332, the environment-based navigation module 320 corrects the geographic position prediction data 317. In some embodiments, the geographic position prediction data 317 include an estimate of the geographic position and/or orientation of the vehicle, which the present disclosure may refer to as the vehicle geographic position prediction. The environment-based navigation module 320 can produce such an estimate using a previous geographic position in addition to a vehicle motion model. The environment-based navigation module 320 can also use a filter process to estimate the uncertainty and/or noise for subsequent prediction and update steps of the vehicle geographic position. Then, by referencing a map of the physical environment, the environment-based navigation module 320 determines an estimate of the current geographic position of the vehicle. The uncertainty in the geographic position of the vehicle creates an uncertainty in the geographic position of observed features. The geographic position uncertainty in the feature information 312 is derived from a combination of the vehicle geographic position uncertainty and sensor noise.
The global map module 328 includes software code (e.g., processor-executable instructions) for processing the local map data 310 from at least two industrial vehicles and generating the global map data 334. In some embodiments, the global map module 328 defines the global map data 334 as a vector of known reference points, which can be used to construct a vector of known features. These features correspond to the features expected to be extracted by the vehicle sensors. Some of the at least two industrial vehicles may use different coordinate systems for the local map data 310. In that case, one or more reference point positions are transformed into positions in a common coordinate system. In some embodiments, the global map module 328 correlates the feature information 312 and reference point data 314 of the local map data associated with each of the at least two industrial vehicles by matching observed features with known reference points and/or adding features of new reference points to the global map data 334. The global map data 334 can later be used to supplement the local map data 310 with expected features that have not yet been observed by a vehicle. Thus, the feature information 312 associated with a particular industrial vehicle can be used to provide additional features of reference points, or entirely new reference points that have not been observed by another industrial vehicle, such as corners, walls, objects behind infrastructure and/or the like.

In one embodiment, the global map module 328 uses statistical methods to estimate the geographic position of a new feature observed by at least two industrial vehicles by evaluating the feature geographic position uncertainty provided by each vehicle. In an alternative embodiment, the global map module 328 can use a filter to combine geographic position measurements from multiple vehicles and develop a derived geographic position estimate.
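One statistical method of this kind, shown purely as an illustrative sketch and not as the patent's algorithm, is inverse-variance weighting of the position reports received from each vehicle:

```python
# Illustrative sketch: fuse a feature position reported by several vehicles by
# weighting each report with the inverse of its reported uncertainty (variance).
def fuse_feature_reports(reports):
    """reports: list of (x, y, variance) tuples from different vehicles."""
    weights = [1.0 / var for _, _, var in reports]
    total = sum(weights)
    x = sum(w * rx for w, (rx, _, _) in zip(weights, reports)) / total
    y = sum(w * ry for w, (_, ry, _) in zip(weights, reports)) / total
    fused_variance = 1.0 / total               # shrinks as more vehicles agree
    return x, y, fused_variance

# Two vehicles report the same corner with different confidence.
print(fuse_feature_reports([(7.20, 9.10, 0.04), (7.30, 9.05, 0.09)]))
```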
In some embodiments, the mobile computer 104 periodically sends a request 319 to the central computer 106. In response to the request 319, the central computer 106 sends an update to the local map data 310. The update may include an update to the empty space occupancy data 316, feature information 312 derived from other vehicles, reference point data 314 and the like. Such updates ensure that the mobile computer uses the latest information to navigate the vehicle. In an alternative embodiment, the central computer 106 may periodically publish updates to the local map data 310 and push the updates to the mobile computer 104.

Figure 4 is a functional block diagram of a system 400 for providing accurate localization for an industrial vehicle according to one or more embodiments. The system 400 includes the mobile computer 104, which is attached to an industrial vehicle, such as a forklift, as well as the sensor array 108. Various software modules within the mobile computer 104 collectively form an environment-based navigation module (e.g., the environment-based navigation module 320 of Figure 3).

The mobile computer 104 includes several software modules (i.e., components) for performing the navigation functions of the environment-based navigation module 320 of Figure 3, such as a localization module 402, a correction module 408 and a vehicle controller 410. In addition, the mobile computer 104 runs the local map module 338. The mobile computer 104 provides accurate localization for the vehicle and updates the local map data 310 with information associated with environmental features. The localization module 402 also includes several components, such as a filter 414 and a feature extraction module 416, for determining a vehicle state 418. The map module 404 includes various data, such as dynamic features 422 and static features 424. The map module 404 also includes several components, such as a feature selection module 420 and a reference point expansion module 426.

In some embodiments, the sensor data is corrected in the correction module 408 to correct for spatial and/or temporal distortion. The localization module 402 processes the corrected data and extracts features from the sensor data using the feature extraction component 416. These features are matched against the features from the local map module 338, with the feature geographic position uncertainty and observation noise taken into account, and the vehicle state 418 is then adjusted by the filter 414. Extracted features are also used to update existing dynamic features 422 and/or to add additional dynamic features. These features are processed by the map module along with the vehicle geographic position uncertainty at the moment of observation, as well as the observation uncertainty caused by noise in the sensor reading. The map module uses the feature and its observed geographic position uncertainty to update the map data 310.

The vehicle state 418, which is modeled by the filter 414, refers to the current vehicle state and includes position information (e.g., coordinates) that indicates the position and orientation of the vehicle. The localization module 402 communicates data associated with the vehicle state 418 to the map module 404 while also communicating such data to the vehicle controller 410. Based on the vehicle position and orientation, the vehicle controller 410 navigates the vehicle to a destination.

In addition to the filter 414 for calculating the vehicle state 418, the localization module 402 also includes the feature extraction module 416 for extracting standard features from the corrected sensor data. The map module 338 uses the feature selection module 420, together with the vehicle state 418, to select, from the available dynamic features 422 and static features 424, those provided to the localization module 402, eliminating invisible features from the feature sets 422 and 424 and thus reducing the number of features that need to be examined by the feature extraction module 416. This can be done by partitioning the map data into smaller areas, where a partition contains only the reference points likely to be observed by the vehicle from its approximate geographic position. The feature selection module 420 also manages the addition and modification of dynamic features 422 in the map data 310. The localization module 402 and/or the map module 338 can update the map data 310 to indicate areas recently occupied by, or cleared of, certain features, such as known placed and picked items. The map module 338 includes a reference point expansion module 426 that processes the reference point data 428 forming part of the local map 406 to create the static features 424 and dynamic features 422 that are associated with each reference point. The reference point expansion module associates a feature uncertainty with the created features according to the uncertainty of the reference point.
It is appreciated that the system 400 can employ several communication devices for environment-based navigation. Any of the software modules within the communication device 104 can be deployed on different or multiple physical hardware components, as can other communication devices. The local map module 338, for example, can run on a server computer (e.g., the central computer 106 of Figure 1) over a network (e.g., the network 302 of Figure 3) to connect with multiple mobile communication devices for the purpose of sharing and updating the map data 310 with a current vehicle position and orientation.

In some embodiments, the local map module 338 processes features observed from the industrial vehicle and generates (or improves) the local map data 310 (i.e., previously stored map data). Essentially, the map module 338 on the mobile computer 104 updates the dynamic features 422 as well as the map data 310 with features and reference points that have been observed by the industrial vehicle with which the mobile computer 104 is associated. Observed features (and/or reference points) are transmitted to the central computer 106 for inclusion in the global map data. In turn, the local map data of other vehicles are updated with information from the global map data. Consequently, disparate vehicles share their observed features and/or reference points. The vehicle controller 410, therefore, can navigate the industrial vehicle associated with the mobile computer 104 using features and/or reference points that were observed by another vehicle, without first observing those reference points itself.
Figure 5 is an interaction diagram illustrating elements and interfaces of a mapping and localization process 500 for an industrial vehicle according to one or more embodiments. Specifically, the mapping and localization process 500 includes processing and communicating various data between components or layers, such as sensor data correction 502, an interface 504, feature extraction 506, data association 508, an EKF 510 and a dynamic map manager 512. The mapping and localization process 500 supports industrial vehicle operation using primarily environmental features. The interface 504 provides control over the layers and is added to an environment-based navigation module.

Feature extraction 506 examines the data input by the sensor devices and extracts observed features (e.g., lines and corners). Data association 508 compares the observed features with known feature information to identify features matching the static and/or dynamic map data 424, 422. The EKF 510 is an Extended Kalman Filter that, given the measurements associated with the matched features and a previous vehicle geographic position, provides the most likely current vehicle geographic position. The map manager 512 maintains an up-to-date dynamic map of features used for localization that are not found in the a priori static map. These updates to the dynamic map data are sent to the central computer for inclusion in the global map data.

Sensor data correction 502 is a step in the localization process 514 where motion artifacts are removed from the sensor data before a prediction of the vehicle geographic position, according to some embodiments. Sensor data correction 502 uses vehicle motion data, which is acquired from various sensor data, and then modifies the sensor data that can be affected by motion of the vehicle before that data is communicated to the interface 504. For example, sensor data correction 502 uses a wheel diameter and encoder data to compute velocity measurements. A change in the geographic position of the vehicle causes motion artifacts in subsequently acquired laser scanner data. Consequently, sensor data correction 502 modifies the laser scanner data before invoking the EKF 510 via the interface 504. The EKF 510, in response, performs a geographic position prediction in order to estimate current position data based on the vehicle motion data. The EKF 510 corrects the estimated current geographic position data in response to the laser scanner data. Via the interface 504, the corrected current position data is communicated back to the vehicle.
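The velocity computation mentioned above can be illustrated with a short, hedged sketch; the encoder resolution and wheel diameter below are example values, not parameters from the patent:

```python
# Linear velocity of one wheel from encoder counts over a sampling interval;
# this kind of estimate can be used to de-skew a laser scan taken while moving.
import math

def wheel_velocity(delta_ticks, ticks_per_rev, wheel_diameter_m, dt_s):
    """Return velocity in metres per second from encoder ticks and wheel size."""
    revolutions = delta_ticks / ticks_per_rev
    distance = revolutions * math.pi * wheel_diameter_m
    return distance / dt_s

# Example: 220 ticks in 0.1 s on a 2048-tick encoder with a 0.35 m wheel (~1.18 m/s).
v = wheel_velocity(220, 2048, 0.35, 0.1)
```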
Figure 6 is a flow diagram of a method 600 of operation of the local map module according to one or more embodiments. In some embodiments, a map module (e.g., the local map module 338 of Figure 3) performs each step of the method 600. The map module can be a component of an environment-based navigation module (e.g., the environment-based navigation module 320 of Figure 3). In other embodiments, some steps are omitted or skipped.

The method 600 starts at step 602 and proceeds to step 604. At step 604, the method 600 processes map data (e.g., the local map data 310 of Figure 3) associated with one or more industrial vehicles. The map data provides locations and/or dimensions for one or more reference points within a physical environment. In some embodiments, the method 600 stores the map data on a mobile computer (e.g., the mobile computer 104 of Figures 1 and 4) coupled to a particular industrial vehicle as local map data (e.g., the map data 310 of Figure 4). In other embodiments, the method 600 stores the map data on a central computer (e.g., the central computer 106 of Figures 1 and 4) as global map data (e.g., the global map data 334 of Figure 3). The local map data may initially comprise map data extracted from the global map data by the central computer and sent to the mobile computer. The extraction of the local map data is based on the vehicle's current position in the environment. The central computer may send updates to the local map data periodically to provide a local map that contains the knowledge acquired from other vehicles. One step processes the local map data to deliver to the localization module a set of features that are currently visible from the currently estimated vehicle geographic position. This step expands the reference points (e.g., the reference points 428 of Figure 4) to create a feature set (e.g., the static features 424 and dynamic features 422 of Figure 4) and applies to the set of local features a geometry that represents the area visible to the vehicle's sensors from the currently estimated geographic position. The localization module then provides a set of observed features with their geometry, position and geographic position uncertainty.

At step 606, the method 600 evaluates the observed features to identify whether new feature information, or in some embodiments new feature observations within a time limit, has been received. If there are no new features, the method proceeds to step 614. At step 608, the method 600 processes the new features against reference point information (e.g., the reference point data 314 of Figure 3) associated with one or more industrial vehicles. These features can be processed using polygon matching or other standard matching techniques. The patterns to be matched are selected from the geometries of entities known in the environment, such as pallets, loads, etc. Observations that cannot be matched are ignored. At step 610, the method 600 creates reference points, or updates existing reference points, for features that are matched. Multiple features may be combined to define a reference point. Reference points represent physical attributes of the environment that can be used for navigation. The method 600 adds the new reference points to the map data as the reference point data 314 of Figure 3.

At step 612, the method 600 generates local map data for the physical environment (i.e., a shared-use space). The local map data include the feature information associated with the disparate features that form a given reference point. In other words, the local map data includes features that are observed by the vehicle. These reference point data (comprising the features) are transmitted to the central computer for inclusion in the global map data.

At step 614, the method 600 decides whether to update the global map with new reference point information. Updates can be generated whenever a new reference point is identified, or can be generated periodically. In addition, the method can generate a request to update the local map to add new features observed by other industrial vehicles. At step 616, the method 600 packages the new reference point information in a form suitable for updating the global map. Each new reference point includes a position and an uncertainty. The uncertainty depends on the uncertainty of the feature observations and on the number of features that match the geometry of the reference point. Thus, the uncertainty includes a geographic position uncertainty and an identification uncertainty. At step 618, the method 600 ends.
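A simplified sketch of the local-map update loop of method 600 is shown below; the plain distance gate stands in for the polygon or pattern matching described in the text and is purely illustrative:

```python
# Match observed features to existing reference points, refine matches with an
# uncertainty-weighted update, and collect unmatched observations as new
# candidates to be packaged for the global map.
import math

def update_local_map(observed, landmarks, gate=0.5):
    """observed: list of (x, y, variance); landmarks: mutable list of dicts."""
    new_landmarks = []
    for ox, oy, ovar in observed:
        match = next((lm for lm in landmarks
                      if math.hypot(lm["x"] - ox, lm["y"] - oy) < gate), None)
        if match:
            # Refine an existing reference point with the new, weighted fix.
            w = match["var"] / (match["var"] + ovar)
            match["x"] += w * (ox - match["x"])
            match["y"] += w * (oy - match["y"])
            match["var"] = match["var"] * ovar / (match["var"] + ovar)
        else:
            lm = {"x": ox, "y": oy, "var": ovar}
            landmarks.append(lm)
            new_landmarks.append(lm)            # candidates for the global map
    return new_landmarks                        # packaged and sent upstream
```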
Figure 7 is a flow diagram of a method 700 of operation of the global map module according to one or more embodiments. In some embodiments, the global map module (e.g., the global map module 328 of Figure 3) performs each step of the method 700. In other embodiments, some steps are omitted or skipped.

The method 700 starts at step 702 and proceeds to step 704. At step 704, the method 700 receives local map data from a vehicle of a plurality of vehicles. The local map data comprises at least one reference point that previously was not recognized as contained within the local map of the sending vehicle. At step 706, the method processes the new reference points. In one embodiment, the process combines the reference point data received from multiple industrial vehicles by evaluating each new reference point for similarity with previously reported reference points; the method adds to existing reference points new features already observed by independent vehicles and reduces the geographic position and identification uncertainty associated with the reference point. In an alternative embodiment, this step implements a filter process that statistically combines observations from multiple industrial vehicles and only generates new reference points for the global map when sufficient observations have been received to reduce the uncertainty to an acceptable limit. In one embodiment, new features are assigned a threshold value that requires the features to be continually observed in order to be retained on the global map.

At step 708, the method 700 updates the global map data on the central computer with the feature information. In one embodiment, this step includes evaluating the threshold value of features on the map and removing those reference points that have not been observed recently. The global map data then includes observed features of other industrial vehicles, which facilitates navigation of each industrial vehicle. Since all industrial vehicles report to the method 700 in this way, the global map contains the observed features of all industrial vehicles.

At step 710, the method 700 determines whether a request (e.g., the request 319 of Figure 3) for map data has been received from a vehicle's mobile computer. If the request was received, the method 700 proceeds to step 714. If, on the other hand, a request for map data has not been received, the method 700 waits until a request arrives and then resumes at step 704. At step 714, the method 700 determines a current vehicle position for the requesting vehicle. In some embodiments, the method 700 extracts the current vehicle position from the request. In other embodiments, the central computer may know the vehicle's position via location reporting. At step 716, the method 700 identifies a corresponding portion of the global map data. At step 718, the method 700 communicates the corresponding portion of the global map as a local map to the vehicle. In response to the request, the method 700 identifies map data for a sub-area of the global map and communicates the local map data to the mobile computer on the industrial vehicle according to one or more embodiments. The new local map data includes all features recently acquired by other vehicles that are near the position of the requesting vehicle. At step 720, the method 700 ends.

In other embodiments, the central computer is aware of the positions of all vehicles. As such, the central computer can automatically extract the local map data and send it to the vehicles, i.e., without a request being received. This process can be carried out either periodically or when a vehicle enters a region that has previously received feature information updates from another vehicle.
In an additional embodiment, local maps can be sent continuously to vehicles as they progress through an environment. Local maps can overlap, such that an earlier map can be used while the next map is being received. A sequence of local maps can be sent to a vehicle, where the sequence covers a path being traversed by that particular vehicle.

Through the use of a global map that is updated using local map data produced by various vehicles, vehicles can improve the environment by adding navigation reference points, for example by positioning an object for use as a reference point to improve navigation, and then sharing the feature information with other vehicles via the global map. In other embodiments, obstructions become dynamic feature entries in the local map data, such that knowledge of an obstruction is shared among vehicles through the global map. To improve the use of features, some embodiments can apply a threshold to certain types of features; i.e., a static feature that is repeatedly identified by vehicles can form a permanent reference point on the global map, while dynamic features that merely appear on the global map can be assigned a threshold value after which they will be removed from the global map unless renewed through being observed by another vehicle.
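The threshold idea can be sketched, under assumptions, as a time-to-live on dynamic reference points that is refreshed whenever any vehicle re-observes them; the field names and the 600-second value are illustrative only:

```python
# Drop dynamic reference points that have not been re-observed within a
# time-to-live window; static reference points are kept indefinitely.
def prune_global_map(landmarks, now, ttl_s=600.0):
    kept = []
    for lm in landmarks:
        # Dynamic entries are assumed to carry a 'last_seen' timestamp.
        if lm.get("static") or now - lm["last_seen"] <= ttl_s:
            kept.append(lm)
    return kept

def renew(landmark, now):
    """Called when a vehicle reports re-observing the reference point."""
    landmark["last_seen"] = now
```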
Various elements, devices and modules are described above in association with their respective functions. These elements, devices and modules are considered means for carrying out their respective functions as described herein.

While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from its basic scope, and the scope thereof is determined by the claims that follow.
Claims:
Claims (28)
[1]
1. A method for building a local map feature with a vehicle-centered measurement, the method characterized in that it comprises: providing a vehicle geographic position indicative of the position of an automated guided vehicle with respect to the global coordinate system of global map data, wherein the automated guided vehicle comprises one or more sensors; providing local map data in a memory associated with the automated guided vehicle, wherein the local map data comprises feature information and reference point data; capturing, with the one or more sensors, vehicle-centered measurement data indicative of one or more objects in the physical environment, the objects comprising an object that changes in the physical environment; automatically determining, with at least one processor, a dynamic feature geographic position indicative of an object position, an object orientation, or both with respect to the global coordinate system by overlaying the vehicle-centered measurement data on the geographic position of the vehicle; and updating the feature information of the local map with the dynamic feature geographic position.
[2]
2. The method according to claim 1, characterized in that the automated guided vehicle comprises a transponder, and in which the geographic position of the vehicle is provided via the transponder.
[3]
3. The method according to claim 1, characterized in that it further comprises: transforming the vehicle-centered measurement data into a set of vehicle-centered features; and determining the geographic position of the vehicle based on the set of vehicle-centered features.
[4]
4. The method according to claim 3, characterized in that it further comprises determining a dynamic geographic position uncertainty indicative of error of the one or more sensors, of the geographic position of the vehicle, or both.
[5]
5. The method according to claim 3, characterized in that it further comprises communicating the dynamic geographic position to a computer located remotely from the automated guided vehicle.
[6]
6. The method according to claim 1, characterized in that it further comprises: receiving a new feature, a new reference point, or both from the global map data, wherein the global map data is stored on a central computer located remotely from the automated guided vehicle; and updating the feature information of the local map data, the reference point data of the local map data, or both, according to the new feature from the global map data, the new reference point from the global map data, or both.
[7]
7. The method according to claim 1, characterized in that the reference point data comprises empty space occupancy data indicative of the presence of a pallet, and the method further comprises storing a pallet feature indicative of the pallet in the memory such that the pallet feature is combined with the feature information of the local map data.
[8]
8. The method according to claim 3, characterized in that it comprises: determining an invisible portion of the feature information, the reference point data, or both based on the geographic position of the vehicle and on an observable area of the one or more sensors; and sharing the local map data as shared map data, wherein the shared map data does not include the invisible portion.
[9]
9. The method according to claim 8, characterized in that it further comprises: transforming the vehicle-centered measurement data into a set of vehicle-centered features; and determining a next vehicle geographic position indicative of a new position of the automated guided vehicle with respect to the global coordinate system based on the set of vehicle-centered features and on the shared map data.
[10]
10. The method according to claim 8, characterized in that the one or more sensors comprise a camera.
[11]
11. The method according to claim 8, characterized in that the one or more sensors comprise a planar laser scanner coupled to one or more sides of the automated guided vehicle.
[12]
12. The method according to claim 8, characterized in that the one or more sensors comprise an encoder attached to a wheel of the automated guided vehicle.
[13]
13. The method according to claim 8, characterized in that the automated guided vehicle comprises a forklift.
[14]
14. The method according to claim 8, characterized in that the physical environment comprises a warehouse or a cold store.
[15]
15. A method for determining the position of a vehicle with a vehicle-centered measurement, characterized in that it comprises: providing a vehicle geographic position indicative of the position of an automated guided vehicle with respect to the global coordinate system of global map data, wherein the automated guided vehicle comprises one or more sensors; providing local map data in a memory associated with the automated guided vehicle, wherein the local map data comprises feature information and reference point data; providing an observable area indicative of the visible range of the one or more sensors of the automated guided vehicle; determining an invisible portion of the reference point data based on the geographic position of the vehicle and on the observable area of the one or more sensors, wherein the invisible portion is outside the observable area; sharing the local map data as shared map data, wherein the shared map data does not include the invisible portion; capturing vehicle-centered measurement data with the one or more sensors, wherein the vehicle-centered measurement is indicative of an object in the physical environment; transforming the vehicle-centered measurement data into a set of vehicle-centered features; and automatically determining, with at least one processor, a next vehicle geographic position indicative of a new position of the automated guided vehicle with respect to the global coordinate system based on the set of vehicle-centered features and on the shared map data.
[16]
16. The method according to claim 15, characterized in that the automated guided vehicle comprises a transponder, and in which the geographic position of the vehicle is provided by that transponder.
[17]
17. The method according to claim 15, characterized in that it comprises: transforming the vehicle-centered measurement data into a feature geographic position indicative of an object position, an object orientation, or both with respect to the global coordinate system, and a feature geographic position uncertainty indicative of error of the measurement data; communicating the feature geographic position, the feature geographic position uncertainty, or both to a central computer located remotely from the automated guided vehicle; and updating the global map data according to the feature geographic position, the feature geographic position uncertainty, or both, wherein the global map data is stored in a memory associated with the central computer.
[18]
18. The method according to claim 15, characterized in that the reference point data comprises empty space occupancy data that indicate the presence of a pallet.
[19]
19. The method according to claim 18, characterized in that it further comprises storing, when the empty space occupancy data indicates a pallet, a pallet feature indicative of the pallet in a memory coupled to the automated guided vehicle such that the pallet feature is associated with the feature information of the local map data.
[20]
20. The method according to claim 15, characterized in that the object changes in the physical environment.
[21]
21. The method according to claim 20, characterized in that the object is a pallet.
[22]
22. The method according to claim 20, characterized in that it further comprises: overlaying the vehicle-centered measurement data on the vehicle geographic position or on the next vehicle geographic position to create an overlaid result; and determining, based on the overlaid result, a dynamic feature geographic position indicative of an object position, an object orientation, or both with respect to the global coordinate system.
[23]
23. Method for updating global map data with a vehicle-centered measurement, characterized by the fact that it comprises: navigate a first automated guided vehicle through a physical environment using global map data that defines a global coordinate system; capture first vehicle-centered measurement data with a first series of sensors coupled to the first automated guided vehicle, where the first vehicle-centered measurement data indicates an object in the physical environment; transform the first vehicle-centered measurement data into a first feature geographic positioning that indicates an object position, an object orientation, or both with respect to the global coordinate system, and a first feature geographic positioning uncertainty that indicates error of the first measurement data; capture second vehicle-centered measurement data with a second series of sensors coupled to a second automated guided vehicle, where the second vehicle-centered measurement data indicates the object in the physical environment; transform the second vehicle-centered measurement data into a second feature geographic positioning that indicates the object position, the object orientation, or both with respect to the global coordinate system, and a second feature geographic positioning uncertainty that indicates error of the second measurement data; transform, automatically with one or more processors, the first feature geographic positioning, the first feature geographic positioning uncertainty, the second feature geographic positioning, and the second feature geographic positioning uncertainty into an estimated feature geographic positioning that indicates the object position, the object orientation, or both with respect to the global coordinate system, where the first feature geographic positioning, the first feature geographic positioning uncertainty, the second feature geographic positioning, and the second feature geographic positioning uncertainty are transformed into the estimated feature geographic positioning according to one or more statistical methods; and update the global map data according to the estimated feature geographic positioning.
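One common statistical method that fits the fusion step recited above is inverse-variance weighting of the two reports; the sketch below (assumed names, a single scalar variance per report) combines the first and second feature geographic positionings and their uncertainties into an estimated feature geographic positioning.

def fuse_estimates(pos_a, var_a, pos_b, var_b):
    """Fuse two 2-D position estimates by inverse-variance weighting.

    pos_a, pos_b -- (x, y) feature geographic positionings from two vehicles
    var_a, var_b -- associated variances (uncertainty) of each estimate
    Returns the estimated position and its (smaller) fused variance.
    """
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = tuple((w_a * a + w_b * b) / (w_a + w_b) for a, b in zip(pos_a, pos_b))
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# First vehicle reports (10.0, 4.0) with variance 0.04; second reports (10.3, 3.8) with 0.09.
print(fuse_estimates((10.0, 4.0), 0.04, (10.3, 3.8), 0.09))
# The global map data would then be updated with the fused (estimated) positioning.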
[24]
24. The method of claim 23, characterized by the fact that it further comprises: determine a geographic positioning of the vehicle that indicates the position of the first automated guided vehicle with respect to the global coordinate system; provide local map data in a memory associated with the first automated guided vehicle, where the local map data comprises feature information and reference point data; detect an invisible portion of the feature information, the reference point data, or both based on the geographic positioning of the vehicle and the observable area of the first series of sensors, where the invisible portion is outside the observable area; and share the local map data into shared map data, where the shared map data does not include the invisible portion.
[25]
25. Method according to claim 23, characterized by the fact that it further comprises: provide local map data in a memory associated with the first automated guided vehicle, where the local map data comprises feature information and reference point data, and where the reference point data comprises empty space occupation data that indicates the presence of a pallet; and store, when the empty space occupation data indicates a pallet, a pallet characteristic that indicates a pallet in the memory, such that the pallet characteristic is associated with the feature information of the local map data.
[26]
26. Method according to claim 23, characterized by the fact that it further comprises: determine a geographic positioning of the vehicle that indicates a position of the first automated guided vehicle with respect to the global coordinate system, an orientation of the first automated guided vehicle with respect to the global coordinate system, or both, where the first vehicle-centered measurement data is transformed into the first feature geographic positioning by overlaying the vehicle-centered measurement data on the geographic positioning of the vehicle.
[27]
27. Method according to claim 23, characterized by the fact that the object changes within the physical environment.
[28]
28. The method of claim 23, characterized by the fact that it further comprises removing the estimated feature geographic positioning from the global map data after a period of time.
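A minimal sketch of the time-based removal described in claim 28, assuming each estimated feature in the global map carries a last-observed timestamp (all names are hypothetical):

import time

def prune_stale_features(global_map, max_age_seconds, now=None):
    """Drop estimated feature positionings not observed within max_age_seconds."""
    now = time.time() if now is None else now
    return [f for f in global_map if now - f["last_seen"] <= max_age_seconds]

global_map = [
    {"id": "pallet_17", "xy": (10.1, 3.9), "last_seen": time.time() - 30.0},
    {"id": "pallet_02", "xy": (44.0, 8.2), "last_seen": time.time() - 7200.0},
]
print(prune_stale_features(global_map, max_age_seconds=3600.0))   # keeps only pallet_17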
Similar technologies:
Publication number | Publication date | Patent title
BR112013030722A2|2020-08-04|Methods for building a feature on a local map, determining the position of a vehicle, and updating global map data
US10611613B2|2020-04-07|Systems and methods for pose development using retrieved position of a pallet or product load to be picked up
EP2721373B1|2018-08-08|Method for facilitating map data processing for industrial vehicle navigation
AU2012304464B2|2015-05-21|Method and apparatus for using pre-positioned objects to localize an industrial vehicle
US9056754B2|2015-06-16|Method and apparatus for using pre-positioned objects to localize an industrial vehicle
EP2863284B1|2020-10-07|Method and apparatus for using unique landmarks to locate industrial vehicles at start-up
US20120303255A1|2012-11-29|Method and apparatus for providing accurate localization for an industrial vehicle
Patent family:
Publication number | Publication date
CA2834932C|2014-08-12|
RU2587641C2|2016-06-20|
CA2834932A1|2012-12-20|
AU2012269800B2|2014-05-29|
AU2012269800A1|2013-05-02|
KR101663977B1|2016-10-12|
US20120323431A1|2012-12-20|
US8594923B2|2013-11-26|
KR101589943B1|2016-02-12|
EP2721374B1|2016-08-10|
EP2721374A1|2014-04-23|
KR20150132603A|2015-11-25|
WO2012173497A1|2012-12-20|
KR20140020331A|2014-02-18|
CN103582803B|2017-04-19|
RU2013154046A|2015-07-20|
CN103582803A|2014-02-12|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

JPS6067818A|1983-09-22|1985-04-18|Hitachi Ltd|On-vehicle navigator|
US4855915A|1987-03-13|1989-08-08|Dallaire Rodney J|Autoguided vehicle using reflective materials|
US4858132A|1987-09-11|1989-08-15|Ndc Technologies, Inc.|Optical navigation system for an automatic guided vehicle, and method|
US5011358A|1988-10-25|1991-04-30|Andersen Eric T|Height indicator for a fork lift truck|
US5179329A|1989-04-25|1993-01-12|Shinko Electric Co., Ltd.|Travel control method, travel control device, and mobile robot for mobile robot systems|
US5051906A|1989-06-07|1991-09-24|Transitions Research Corporation|Mobile robot navigation employing retroreflective ceiling features|
US5646845A|1990-02-05|1997-07-08|Caterpillar Inc.|System and method for controlling an autonomously navigated vehicle|
US5170352A|1990-05-07|1992-12-08|Fmc Corporation|Multi-purpose autonomous vehicle with path plotting|
US5202832A|1991-01-29|1993-04-13|R. R. Donnelley & Sons Co.|Material handling automation system using portable transfer module|
US5491670A|1993-01-21|1996-02-13|Weber; T. Jerome|System and method for sonic positioning|
US5539638A|1993-08-05|1996-07-23|Pavilion Technologies, Inc.|Virtual emissions monitor for automobile|
US5471393A|1994-01-26|1995-11-28|Bolger; Joe|Driver's associate: a system for vehicle navigation and driving assistance|
US5961571A|1994-12-27|1999-10-05|Siemens Corporated Research, Inc|Method and apparatus for automatically tracking the location of vehicles|
US5916285A|1995-10-18|1999-06-29|Jervis B. Webb Company|Method and apparatus for sensing forward, reverse and lateral motion of a driverless vehicle|
DE19613386A1|1996-04-03|1997-10-09|Fiat Om Carrelli Elevatori|Industrial truck, which can be operated either manually or automatically|
SE9601440D0|1996-04-15|1996-04-15|Apogeum Ab|Method for determining a plurality of fixed objects|
US6092010A|1997-09-03|2000-07-18|Jervis B. Webb Company|Method and system for describing, generating and checking non-wire guidepaths for automatic guided vehicles|
SE511504C2|1997-10-17|1999-10-11|Apogeum Ab|Method and apparatus for associating anonymous reflectors to detected angular positions|
DE19757333C1|1997-12-22|1999-09-16|Litef Gmbh|Automatic, fast calibration of an on-board measurement of a speed vector|
JPH11296229A|1998-02-13|1999-10-29|Komatsu Ltd|Vehicle guide device|
US6073076A|1998-03-27|2000-06-06|Navigation Technologies Corporation|Memory management for navigation system|
JP3316842B2|1998-08-06|2002-08-19|村田機械株式会社|Automatic guided vehicle system and automatic guided vehicle guidance method|
JP3316841B2|1998-08-06|2002-08-19|村田機械株式会社|Automatic guided vehicle system|
JP3918326B2|1998-10-26|2007-05-23|株式会社デンソー|Route setting device and navigation device|
US7123166B1|2000-11-17|2006-10-17|Haynes Michael N|Method for managing a parking lot|
JP2002048579A|2000-04-28|2002-02-15|Matsushita Electric Ind Co Ltd|Communication type navigation system|
US6917839B2|2000-06-09|2005-07-12|Intellectual Assets Llc|Surveillance system and method having an operating mode partitioned fault classification model|
JP2002108446A|2000-09-29|2002-04-10|Nippon Seiki Co Ltd|Method for guiding traveling object|
US6428439B1|2000-10-04|2002-08-06|Gkn Automotive, Inc.|Integrated viscous transmission in a differential|
US6952488B2|2001-08-27|2005-10-04|Carnegie Mellon University|System and method for object localization|
RU2212113C2|2001-09-07|2003-09-10|Марат Саитович Тугушев|Method for automatic data communications using coded card and coded card for implementing it|
JP3968501B2|2001-11-30|2007-08-29|ソニー株式会社|Robot self-position identification system and self-position identification method|
US7844364B2|2002-04-16|2010-11-30|Irobot Corporation|Systems and methods for dispersing and clustering a plurality of robotic devices|
WO2003093768A1|2002-04-30|2003-11-13|Telmap Ltd.|Navigation system using corridor maps|
DE10220936A1|2002-05-10|2003-12-04|Siemens Ag|Localization device with fixed and / or variable landmarks|
US20050149256A1|2002-05-10|2005-07-07|Siemens Aktiengesellschaft|Device for determining the position by means of fixed and/or variable landmarks|
DE10230479A1|2002-07-06|2004-01-15|Robert Bosch Gmbh|Driver information device|
GB2389947B|2002-07-25|2004-06-02|Golden River Traffic Ltd|Automatic validation of sensing devices|
DE10234730A1|2002-07-30|2004-02-19|Josef Schreiner|Position determination method for use with industrial trucks, e.g. forklift trucks, within a defined area, wherein the positions of transport and reference fixed objects are known and truck positions are determined from them|
AU2003300959A1|2002-12-17|2004-07-22|Evolution Robotics, Inc.|Systems and methods for visual simultaneous localization and mapping|
KR100506533B1|2003-01-11|2005-08-05|삼성전자주식회사|Mobile robot and autonomic traveling system and method thereof|
US7221928B2|2003-10-01|2007-05-22|Laird Mark D|Mobile emergency notification system|
JP4409904B2|2003-10-08|2010-02-03|株式会社日立製作所|Route information providing system and route information providing method|
KR100982058B1|2003-10-20|2010-09-13|엘지전자 주식회사|Method for managing map data of a vehicle|
US7689321B2|2004-02-13|2010-03-30|Evolution Robotics, Inc.|Robust sensor fusion for mapping and localization in a simultaneous localization and mapping system|
KR20060130715A|2004-03-03|2006-12-19|닛본 덴끼 가부시끼가이샤|Positioning system, positioning method, and program thereof|
KR100571837B1|2004-03-05|2006-04-17|삼성전자주식회사|Method and apparatus for controlling navigation in automatic navigation vehicle|
US7246007B2|2004-03-24|2007-07-17|General Motors Corporation|System and method of communicating traffic information|
DE112005000738T5|2004-03-29|2007-04-26|Evolution Robotics, Inc., Pasadena|Method and device for determining position using reflected light sources|
US7148458B2|2004-03-29|2006-12-12|Evolution Robotics, Inc.|Circuit for estimating position and orientation of a mobile object|
KR100703692B1|2004-11-03|2007-04-05|삼성전자주식회사|System, apparatus and method for improving readability of a map representing objects in space|
US20100222925A1|2004-12-03|2010-09-02|Takashi Anezaki|Robot control apparatus|
US7845560B2|2004-12-14|2010-12-07|Sky-Trax Incorporated|Method and apparatus for determining position and rotational orientation of an object|
US8381982B2|2005-12-03|2013-02-26|Sky-Trax, Inc.|Method and apparatus for managing and controlling manned and automated utility vehicles|
US8497761B2|2005-01-13|2013-07-30|Rite-Hite Holding Corporation|System and method for remotely controlling docking station components|
US7555363B2|2005-09-02|2009-06-30|Neato Robotics, Inc.|Multi-function robotic device|
GB0520576D0|2005-10-10|2005-11-16|Applied Generics Ltd|Using traffic monitoring information to provide better driver route planning|
WO2007050406A1|2005-10-21|2007-05-03|Deere & Company|Networked multi-role robotic vehicle|
US7634336B2|2005-12-08|2009-12-15|Electronics And Telecommunications Research Institute|Localization system and method of mobile robot based on camera and landmarks|
US7616642B2|2006-01-04|2009-11-10|Sap Ag|Priority assignment and transmission of sensor data|
US8050863B2|2006-03-16|2011-11-01|Gray & Company, Inc.|Navigation and control system for autonomous vehicles|
US7646336B2|2006-03-24|2010-01-12|Containertrac, Inc.|Automated asset positioning for location and inventory tracking using multiple positioning techniques|
JP4763537B2|2006-07-13|2011-08-31|株式会社デンソー|Driving support information notification device|
WO2008072412A1|2006-12-11|2008-06-19|Mitsubishi Electric Corporation|Navigation apparatus|
CN101246476A|2007-02-15|2008-08-20|北京腾瑞万里信息技术有限公司|Map data downloading method, downloading device and server|
JP4975503B2|2007-04-06|2012-07-11|本田技研工業株式会社|Legged mobile robot|
JP4328813B2|2007-04-06|2009-09-09|本田技研工業株式会社|MOBILE DEVICE, ITS CONTROL METHOD AND CONTROL PROGRAM|
DE102007021693A1|2007-05-09|2008-11-13|Götting jun., Hans-Heinrich, Dipl.-Ing. |Auxiliary system for determining position of fork-lift truck in parking space, has unit for coupling and/or fixing position of vehicle, where determined position is corrected by relative positioning of vehicle relative to object|
JP5047709B2|2007-07-04|2012-10-10|株式会社日立製作所|Moving device, system, moving method, and moving program|
WO2009007983A2|2007-07-12|2009-01-15|Carmel - Haifa University Economic Corp Ltd.|Localization method for mobile robots based on landmarks|
US20090140887A1|2007-11-29|2009-06-04|Breed David S|Mapping Techniques Using Probe Vehicles|
US20090216438A1|2008-02-21|2009-08-27|Microsoft Corporation|Facility map framework|
US8126642B2|2008-10-24|2012-02-28|Gray & Company, Inc.|Control and systems for autonomously driven vehicles|
JP4655139B2|2008-11-19|2011-03-23|トヨタ自動車株式会社|Mobile positioning device|
KR101214143B1|2008-12-22|2012-12-20|한국전자통신연구원|Method and apparatus for detecting position and orientation|
US8930023B2|2009-11-06|2015-01-06|Irobot Corporation|Localization by learning of wave-signal distributions|
EP2506106B1|2009-11-27|2019-03-20|Toyota Jidosha Kabushiki Kaisha|Autonomous moving object and control method|
US20110153338A1|2009-12-17|2011-06-23|Noel Wayne Anderson|System and method for deploying portable landmarks|
US8340438B2|2009-12-17|2012-12-25|Deere & Company|Automated tagging for landmark identification|
US20120101784A1|2010-10-25|2012-04-26|Trimble Navigation Limited|Wide-area agricultural monitoring and prediction|
EP2668623A2|2011-01-24|2013-12-04|Sky-Trax, Inc.|Inferential load tracking|
US8594923B2|2011-06-14|2013-11-26|Crown Equipment Limited|Method and apparatus for sharing map data associated with automated industrial vehicles|
US8902080B2|2009-04-07|2014-12-02|Mitsubishi Electric Corporation|Vehicle-mounted narrow-band wireless communication apparatus and roadside-to-vehicle narrow-band wireless communication system|
US8508590B2|2010-03-02|2013-08-13|Crown Equipment Limited|Method and apparatus for simulating a physical environment to facilitate vehicle operation and task completion|
US8538577B2|2010-03-05|2013-09-17|Crown Equipment Limited|Method and apparatus for sensing object load engagement, transportation and disengagement by automated vehicles|
CN103608740B|2011-04-11|2017-06-30|克朗设备公司|The method and apparatus that multiple automatic incomplete vehicles are effectively dispatched using coordinated path planner|
US8655588B2|2011-05-26|2014-02-18|Crown Equipment Limited|Method and apparatus for providing accurate localization for an industrial vehicle|
EP2715286B1|2011-05-31|2020-11-25|John Bean Technologies Corporation|Deep lane navigation system for automatic guided vehicles|
US8548671B2|2011-06-06|2013-10-01|Crown Equipment Limited|Method and apparatus for automatically calibrating vehicle parameters|
US8589012B2|2011-06-14|2013-11-19|Crown Equipment Limited|Method and apparatus for facilitating map data processing for industrial vehicle navigation|
US8594923B2|2011-06-14|2013-11-26|Crown Equipment Limited|Method and apparatus for sharing map data associated with automated industrial vehicles|
US20140058634A1|2012-08-24|2014-02-27|Crown Equipment Limited|Method and apparatus for using unique landmarks to locate industrial vehicles at start-up|
US9056754B2|2011-09-07|2015-06-16|Crown Equipment Limited|Method and apparatus for using pre-positioned objects to localize an industrial vehicle|
US8798840B2|2011-09-30|2014-08-05|Irobot Corporation|Adaptive mapping with spatial summaries of sensor data|
DE102012217085A1|2012-09-21|2014-05-28|Continental Automotive Gmbh|communication system|
US20140249771A1|2013-03-01|2014-09-04|Xue Yang|Location estimation using a mobile device|
DE102013211109A1|2013-06-14|2014-12-31|Robert Bosch Gmbh|Assistance device and method for assisting a driver of the vehicle|
US9354070B2|2013-10-31|2016-05-31|Crown Equipment Corporation|Systems, methods, and industrial vehicles for determining the visibility of features|
DE102014202313A1|2014-02-07|2015-08-13|Robert Bosch Gmbh|Method for transmitting an environment model|
US9886036B2|2014-02-10|2018-02-06|John Bean Technologies Corporation|Routing of automated guided vehicles|
EP3158485A4|2014-06-23|2018-06-20|Sikorsky Aircraft Corporation|Cooperative safe landing area determination|
US9733646B1|2014-11-10|2017-08-15|X Development Llc|Heterogeneous fleet of robots for collaborative object processing|
US10022867B2|2014-11-11|2018-07-17|X Development Llc|Dynamically maintaining a map of a fleet of robotic devices in an environment to facilitate robotic action|
US9465390B2|2014-11-11|2016-10-11|Google Inc.|Position-controlled robotic fleet with visual handshakes|
US9864371B2|2015-03-10|2018-01-09|John Bean Technologies Corporation|Automated guided vehicle system|
MX2017015064A|2015-05-26|2018-05-07|Crown Equip Corp|Systems and methods for materials handling vehicle odometry calibration.|
CN106371452B|2015-07-24|2020-08-25|深圳市道通智能航空技术有限公司|Method, device and system for acquiring and sharing flight-limiting area information of aircraft|
JP6571462B2|2015-09-10|2019-09-04|千代田化工建設株式会社|Construction material management system and construction material management method|
US11100673B2|2015-09-24|2021-08-24|Apple Inc.|Systems and methods for localization using surface imaging|
WO2017053357A1|2015-09-24|2017-03-30|Quovard Management Llc|Method and associated system for providing navigation information to a vehicle|
US9682481B2|2015-10-26|2017-06-20|X Development Llc|Communication of information regarding a robot using an optical identifier|
US10264431B2|2016-02-01|2019-04-16|Caterpillar Inc.|Work site perception system|
WO2017192358A1|2016-05-06|2017-11-09|Pcms Holdings, Inc.|Method and system for collaborative sensing for updating dynamic map layers|
MX2018014334A|2016-05-23|2019-08-12|Crown Equip Corp|Systems and methods for use of a materials handling vehicle in a warehouse environment.|
EP3506238A4|2016-08-26|2019-11-27|Panasonic Intellectual Property Corporation of America|Three-dimensional information processing method and three-dimensional information processing apparatus|
KR20180055571A|2016-11-17|2018-05-25|삼성전자주식회사|Mobile Robot System, Mobile Robot And Method Of Controlling Mobile Robot System|
CN106802654A|2016-11-25|2017-06-06|斑马信息科技有限公司|Move car system and move car method|
US11194334B2|2016-11-30|2021-12-07|Panasonic Intellectual Property Corporation Of America|Autonomous mobile device, autonomous delivery system, delivery method, and non-transitory recording medium|
KR101955630B1|2016-12-08|2019-03-07|삼성중공업|Apparatus and method for managing position of material|
KR101959814B1|2016-12-16|2019-03-21|네이버 주식회사|Physical distribution system and method using indoor autonomous riding robot for handicapped person and indoor autonomous riding robot|
US10866102B2|2016-12-23|2020-12-15|X Development Llc|Localization of robotic vehicles|
WO2018120010A1|2016-12-30|2018-07-05|Intel Corporation|Memory sharing for application offload from host processor to integrated sensor hub|
CN107063275B|2017-03-24|2020-05-12|重庆邮电大学|Intelligent vehicle map fusion system and method based on road side equipment|
KR20190136038A|2017-03-30|2019-12-09|크라운 이큅먼트 코포레이션|Warehouse mapping tool|
AU2018248330A1|2017-04-07|2019-10-31|BXB Digital Pty Limited|Systems and methods for tracking promotions|
DE102017206130A1|2017-04-10|2018-10-11|Robert Bosch Gmbh|Method and apparatus for creating and providing a first card|
US10761541B2|2017-04-21|2020-09-01|X Development Llc|Localization with negative mapping|
WO2018204499A1|2017-05-02|2018-11-08|BXB Digital Pty Limited|Systems and methods for pallet identification|
US10479354B2|2017-05-02|2019-11-19|Cnh Industrial America Llc|Obstacle detection system for a work vehicle|
IT201700076968A1|2017-07-07|2019-01-07|Prefab S R L|METHOD FOR THE MOVEMENT OF GOODS|
BR112020003710A2|2017-08-21|2020-09-01|BXB Digital Pty Limited|systems and methods for monitoring pallets using star architecture|
MA50387A|2017-10-20|2020-08-26|Bxb Digital Pty Ltd|FREIGHT CARRIER TRACKING SYSTEMS AND METHODS|
US11091211B1|2017-12-22|2021-08-17|AI Incorporated|Autonomous versatile vehicle system|
FR3076898B1|2018-01-16|2021-01-01|Balyo|MAPPING METHOD AND DEVICE FOR AUTOMATIC VEHICLES|
US10754343B2|2018-02-15|2020-08-25|X Development Llc|Semantic mapping of environments for autonomous devices|
DE102018207857A1|2018-05-18|2019-11-21|Zf Friedrichshafen Ag|Method and positioning system for transforming a position of a vehicle|
DE102018209603A1|2018-06-14|2019-12-19|Robert Bosch Gmbh|Method and device for controlling self-driving vehicles|
US11227486B2|2018-07-16|2022-01-18|Here Global B.V.|Method, apparatus, and system for estimating vulnerable road users|
CN110945498A|2018-07-24|2020-03-31|谷歌有限责任公司|Map uncertainty and observation model|
US20200050205A1|2018-08-07|2020-02-13|Cnh Industrial America Llc|System and method for updating a mapped area|
US20200102147A1|2018-09-28|2020-04-02|Autoguide, LLC|Commercial logistic facility, configurable modular robotic autonomous guided vehicle, and method therefor|
US10824160B2|2018-09-28|2020-11-03|X Development Llc|Robot localization with co-located markers|
US10852740B2|2018-09-28|2020-12-01|X Development Llc|Determining the orientation of flat reflectors during robot mapping|
US11017674B1|2018-10-15|2021-05-25|Waymo Llc|Managing and tracking scouting tasks using autonomous vehicles|
CN109282821B|2018-11-28|2019-12-03|浙江明度智控科技有限公司|A kind of garden air navigation aid and system|
US10816637B2|2018-12-27|2020-10-27|Chep Technology Pty Limited|Site matching for asset tracking|
US20200210719A1|2018-12-28|2020-07-02|Sony Mobile Communications Inc.|Optical vehicle positioning|
US20200209870A1|2018-12-28|2020-07-02|Continental Automotive Systems, Inc.|Collection and distribution of environmental information in limited access areas|
CN113424237A|2019-02-25|2021-09-21|Bxb数码私人有限公司|Intelligent physical closure in supply chain|
CN110207710B|2019-06-26|2021-03-16|北京小狗智能机器人技术有限公司|Robot repositioning method and device|
US11206677B2|2019-08-14|2021-12-21|Toyota Motor North America, Inc.|Sharing vehicle map data over transmission media selected according to urgency of the map data|
DE102019214115A1|2019-09-17|2021-03-18|Robert Bosch Gmbh|Method for distributing navigation map data to a plurality of work machines|
US20210347617A1|2020-05-11|2021-11-11|Autoguide, LLC.|Engaging an element|
Legal status:
2020-08-18| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2020-08-25| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-09-08| B25A| Requested transfer of rights approved|Owner name: CROWN EQUIPMENT CORPORATION (US) |
2020-10-20| B08F| Application dismissed because of non-payment of annual fees [chapter 8.6 patent gazette]|Free format text: REGARDING THE 8TH ANNUITY. |
2020-12-08| B11B| Dismissal acc. art. 36, par 1 of ipl - no reply within 90 days to fullfil the necessary requirements|
2021-11-03| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US13/159500|2011-06-14|
US13/159,500|US8594923B2|2011-06-14|2011-06-14|Method and apparatus for sharing map data associated with automated industrial vehicles|
PCT/NZ2012/000092|WO2012173497A1|2011-06-14|2012-06-08|Method and apparatus for sharing map data associated with automated industrial vehicles|