SYSTEM AND METHOD FOR DYNAMIC IMAGE MASKING
Patent abstract:
A dynamic image masking system having a remote sensing platform (14) and an imaging system (12) associated with the remote sensing platform (14) and having an optical system (20) and an image detection system (22). The dynamic image masking system further includes a multi-level security system (42) associated with the imaging system (12), and one or more image modification locations in the imaging system (12) and the multi-level security system (42), wherein a modification of the images occurs via the dynamic image masking process. The dynamic image masking system further includes a computer system associated with the imaging system (12) and having a controller algorithm configured to send controller commands to one or more controllers that control the one or more image modification locations through the dynamic image masking process.
Publication number: FR3026540A1
Application number: FR1559008
Filing date: 2015-09-24
Publication date: 2016-04-01
Inventors: Charles B Spinelli; Robert W Turner
Applicant: Boeing Co
Main IPC class:
Patent description:
[0001] DYNAMIC IMAGE MASKING SYSTEM AND METHOD Field of the Disclosure The disclosure generally relates to systems and methods for remote sensing image collection, and more particularly to automated dynamic image masking systems and methods for remote sensing image collection, such as aerial remote sensing image collection. [0002] Description of the Related Art Remote sensing, such as aerial remote sensing, involves the use of sensor and imaging technologies, such as radar imaging systems, camera imaging systems, Light Detection And Ranging (LIDAR) systems, and other sensor and imaging systems, to obtain images of the ground and surface of the Earth and of distant objects. Analog aerial photography, videography and digital photography are commonly used in airborne remote sensing to collect images. Digital photography allows real-time transmission of remotely sensed data to a ground station or base station for immediate analysis, and digital images can be analyzed and interpreted using a computer. Nevertheless, the collection of aerial remote sensing imagery can be tedious, and if the imaging system is not correctly oriented when an image capture event occurs, a large amount of not very useful information can be generated. For precision agriculture (PA) based aerial remote sensing missions, which may collect images of agricultural fields and crops to determine the health and vigor of plants, an operator must deal with large areas of land, a unique and well-known distribution of image collection sites, a well defined flight profile in terms of range, flight time, altitude, position and speed, and distinct image exclusion areas; the latter can significantly hinder precision agriculture (PA) based aerial remote sensing operations if they are not accounted for. 
For example, when flying a precision agriculture-based aerial remote sensing mission, an air vehicle, such as an unmanned aerial vehicle (UAV), may have to access a first farm by flying over other populated areas. It may not be desirable to begin imaging before being in position over the first farm, so it is desirable that automatic control of the UAV and the imaging system be integrated and allow autonomous operations. Known systems and methods for collecting aerial remote sensing imagery may include pilot-controlled flight with the pilot located in the local field with a view of the entire area, and thus may not allow autonomous operations, which may be preferred for the precision agriculture market. In addition, without a well-defined collection area, there may be too much land or area to be imaged, and the amount of image data collected may overload the sensor and the imaging system. In addition, data that is restricted or out of range and not in a defined collection area may be imaged and inadvertently collected. In addition, known aerial remote sensing image collection systems and methods may include manually operated shutter control, which may be preprogrammed (every n seconds) or operator initiated. However, with such manual operation used in the precision farming market, it may be necessary to have access to a particular field which is designated to be imaged. This option may require flying beyond a line of sight to get to the right destination, especially when flying at low altitudes, less than 400 feet above ground level. Nevertheless, such an option may be labor-intensive, expensive, and may not provide the desired results for addressing the precision agriculture market. Another option may be to have an active link between the imaging camera system and the ground controller (pilot and ground control station operator) that provides a bird's eye view of the area. 
This can be used to alert operators when it is time to maneuver and when it is time to take a picture. Nevertheless, this option may be labor-intensive and may not meet all the requirements of a precision farming mission. Accordingly, there is a need in the art for an improved system and method for a dynamic image masking system that provides a filtered autonomous remote sensing image through a dynamic image masking process and provides a way to hide or modify pixels that are unwanted or unrelated to an image gathering event or mission, such as a precision agriculture mission, and that provides advantages over known systems and methods. SUMMARY Exemplary embodiments of the present disclosure provide an improved system and method for a dynamic image masking system that provides a filtered autonomous remote sensing image through a dynamic image masking process to overcome limitations of existing solutions. As discussed in the detailed description below, embodiments of improved systems and methods for a dynamic image masking system that provides a filtered autonomous remote sensing image through a dynamic image masking process can provide significant advantages over existing systems and processes. In one embodiment of the disclosure, there is provided a dynamic image masking system for providing a filtered autonomous remote sensing image via a dynamic image masking process. The dynamic image masking system includes a remote sensing platform. The dynamic image masking system further includes an imaging system associated with the remote sensing platform. The imaging system includes an optical system and an image detection system. [0003] The dynamic image masking system further includes a multi-level security system associated with the imaging system. The dynamic image masking system further includes one or more image modification locations in the imaging system and the multi-level security system, where a modification of the one or more images occurs via the dynamic image masking process. The dynamic image masking system further includes a computer system associated with the imaging system. The computer system includes a controller algorithm configured to send controller commands to one or more controllers that control the one or more image modification locations through the dynamic image masking process. In another embodiment of the disclosure, there is provided a method of providing a filtered autonomous remote sensing image via a dynamic image masking process. The method includes the step of equipping a remote sensing platform with an imaging system. The method further includes the step of designating an imaging area to obtain a designated area to be imaged. The method further includes the step of establishing a plurality of reference points on a surface of the designated area to be imaged. The method further includes the step of designating a plurality of specific surface areas as an excluded area not to be imaged with reference to the plurality of reference points. The method further includes the step of conducting a pre-established collection planning process covering the designated area to be imaged. The method further includes the step of using a navigation system comprising a Global Positioning System (GPS), a radio navigation system, an optical navigation system, an Inertial Measurement Unit (IMU) system, an inertial measurement unit (IMU) system equipped with a magnetometer, or a combination thereof, for positioning the imaging system to image the designated area to be imaged. The method further includes the step of using the imaging system to image the designated area to be imaged that is covered through the pre-established collection planning process. The method further includes the step of dynamically nullifying one or more pixels in one or more images of the excluded area. 
The method further includes the step of obtaining an autonomous remote sensing image, filtered by dynamic image masking, of the designated area to be imaged. In another embodiment of the disclosure, there is provided a method of providing a filtered autonomous remote sensing image via a dynamic image masking process. The method includes the step of equipping an unmanned aerial vehicle (UAV) with an imaging system. The method further includes the step of designating an imaging area to obtain a designated area to be imaged. The method further includes the step of establishing a plurality of reference points on a surface of the designated area to be imaged. [0004] The method further includes the step of designating a plurality of specific surface areas as an excluded area not to be imaged with reference to the plurality of reference points. The method further includes the step of conducting a pre-established flight plan of the UAV covering the designated area to be imaged. The method further includes the step of using a navigation system comprising a Global Positioning System (GPS), a radio navigation system, an optical navigation system, an Inertial Measurement Unit (IMU) system, an inertial measurement unit (IMU) system equipped with a magnetometer, or a combination thereof, for positioning the imaging system to image the designated area to be imaged. The method further includes the step of the UAV flying over the designated area to be imaged, and using the imaging system to image the designated area to be imaged that is covered by the pre-established flight plan of the UAV. The method further includes the step of dynamically nullifying one or more pixels in one or more images of the excluded area. The method further includes the step of obtaining an autonomous remote sensing image, filtered by dynamic image masking, of the designated area to be imaged. 
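The "dynamically nullifying" step recited above can be illustrated in code. The following is a minimal, hypothetical sketch, not the implementation of the disclosure: the function name, the rectangular model of the excluded area, and the pixel-to-ground mapping are all assumptions made for the example.

```python
# Hypothetical sketch of dynamically nullifying pixels of an excluded area.
# A pixel whose ground coordinate falls inside the excluded rectangle is
# overwritten with a fill value (0 = zero saturation, 255 = full saturation).

def mask_excluded_pixels(image, pixel_to_ground, excluded_rect, fill=0):
    """Return a copy of `image` with pixels mapping into `excluded_rect` nulled.

    image           -- 2-D list of pixel intensities (rows of ints)
    pixel_to_ground -- function (row, col) -> (x, y) ground coordinate
    excluded_rect   -- (xmin, ymin, xmax, ymax) area not to be imaged
    fill            -- replacement value for nullified pixels
    """
    xmin, ymin, xmax, ymax = excluded_rect
    masked = [row[:] for row in image]  # leave the raw frame untouched
    for r, row in enumerate(image):
        for c, _ in enumerate(row):
            x, y = pixel_to_ground(r, c)
            if xmin <= x <= xmax and ymin <= y <= ymax:
                masked[r][c] = fill     # pixel lies in the excluded area
    return masked

# Toy usage: a 4 x 4 frame over a 4 m x 4 m ground patch; the strip x >= 2 m
# is the excluded area, so the two rightmost pixel columns are nulled.
frame = [[100] * 4 for _ in range(4)]
masked = mask_excluded_pixels(frame, lambda r, c: (c, r), (2, 0, 3, 3))
```

In the disclosure the replacement could equally be 100 percent saturation or non-collection; only the zero-fill variant is shown here.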
The features, functions and advantages that have been discussed may be achieved independently in various embodiments of the disclosure or may be combined in yet other embodiments, further details of which may be seen with reference to the following description and drawings. Brief Description of the Drawings The disclosure may be better understood by reference to the following detailed description taken in conjunction with the accompanying drawings, which illustrate exemplary preferred embodiments but are not necessarily drawn to scale, in which: FIG. 1 is an illustration of a system block diagram of an embodiment of an imaging system that can be used in an embodiment of a dynamic image masking system and method of the disclosure; Fig. 2 is an illustration of a system block diagram of an embodiment of a dynamic image masking system of the disclosure with the imaging system of Fig. 1, showing a controller algorithm and various image modification locations in the dynamic image masking system; Fig. 3 is an illustration of a functional block diagram of an embodiment of a controller algorithm used in an embodiment of a dynamic image masking system and method of the disclosure; Fig. 4A is an illustration of a system block diagram of one embodiment of a dynamic image masking system of the disclosure; Fig. 4B is an illustration of a system block diagram of one embodiment of a dynamic image masking method of the disclosure; Fig. 5A is an illustration of a flowchart of an embodiment of a method of the disclosure; Fig. 5B is an illustration of a flowchart of another embodiment of a method of the disclosure; Fig. 6 is an illustration of a schematic representation of an unmanned aerial vehicle (UAV) that can be used in an embodiment of a dynamic image masking system and method of the disclosure; Fig. 7 is an illustration of a flowchart of an embodiment of an aircraft manufacturing and maintenance process; and Fig. 
8 is an illustration of a functional block diagram of an embodiment of an aircraft. [0005] DETAILED DESCRIPTION Embodiments disclosed herein will now be described more fully with reference to the accompanying drawings, in which some, but not all, of the disclosed embodiments are shown. Indeed, several different embodiments may be provided and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough and fully conveys the scope of the disclosure to those skilled in the art. [0006] With reference to the figures, FIG. 1 is an illustration of a system block diagram of an embodiment of an imaging system 12 that may be used in one embodiment of a dynamic image masking system 10 (see FIG. 2), a method 150 (see FIG. 5A), and a method 170 (see FIG. 5B) of the disclosure. [0007] As shown in FIG. 1, the imaging system 12 is associated with a remote sensing platform 14, and includes an optical system 20 and an image detection system 22. The optical system 20 (see FIG. 1), such as a camera 20a (see also FIG. 4A), represents an optical view of the entire world. A collection planning process 16 (see FIG. 1) provides a collection planning process output 18 (see FIG. 1) to the optical system 20 (see FIG. 1). The optical system 20 (see FIG. 1) provides raw image data output 24 to a focal plane array 26 (see FIG. 1) of the focal plane array subsystem 22a (see FIG. 1). As shown in FIG. 1, the image detection system 22 (see FIG. 1) may comprise the focal plane array subsystem 22a (see FIG. 1), and the focal plane array subsystem 22a comprises the focal plane array 26, an analog-to-digital converter (A/D) 30, a volatile temporary memory 34, a digital signal processor 38 and a digital-to-analog converter (D/A) 54. [0008] The focal plane array 26 (see FIG. 1) reads the raw image data 24 (see FIG. 1) and transmits them to the analog-to-digital converter 30 (see FIG. 1). 
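As a rough illustration of the readout path from the focal plane array 26 into the analog-to-digital converter 30, the sketch below quantizes analog photosite levels into 8-bit digital values. The linear transfer function, the [0, 1] analog range, the 8-bit depth, and the function name are assumptions for the example, not details taken from the disclosure.

```python
# Illustrative 8-bit analog-to-digital conversion of one row read off the
# focal plane array: each analog level in [0.0, 1.0] becomes an integer 0..255.

def adc_8bit(analog_levels):
    """Quantize analog photosite levels in [0.0, 1.0] to 8-bit values."""
    digital = []
    for v in analog_levels:
        v = min(max(v, 0.0), 1.0)       # clamp out-of-range readout noise
        digital.append(round(v * 255))  # uniform 8-bit quantization
    return digital

# One readout row: dark photosite, mid-grey photosite, saturated photosite.
row = adc_8bit([0.0, 0.5, 1.0])
```

A 10- or 12-bit converter would only change the scale factor; the digitized values would then sit in the volatile temporary memory 34 until read by the digital signal processor 38.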
The analog-to-digital converter 30 (see FIG. 1) provides the analog-to-digital converter output 32 (see FIG. 1) to the volatile temporary memory 34 (see FIG. 1), where an image 122 (see FIG. 4A) is temporarily stored (a next image overwrites a current image). The volatile temporary memory 34 (see FIG. 1) then provides a volatile temporary memory output 36 (see FIG. 1) to the digital signal processor 38 (see FIG. 1). Several actions can take place in the digital signal processor 38 (see FIG. 1) including, for example, reading the digital signals 37 (see FIG. 4B), adjusting gain, processing the digital signals 37 (see FIG. 4B) through a Bayer filter (i.e., a color filter array (CFA) for arranging RGB (Red-Green-Blue) color filters on a square grid of photosensors, into a JPG file format, for example), and performing image enhancement techniques such as edge enhancement. After the digital signal 37 (see FIG. 4B) is processed into a readable image format 39 (see FIG. 4B) by the digital signal processor 38 (see FIG. 1), the digital signal processor 38 (see FIG. 1) provides the digital output 40 (see FIG. 1) for storage in a resultant nonvolatile memory 44 (see FIG. 1) of a multi-level security system 42 (see FIG. 1). From the resultant nonvolatile memory 44 (see FIG. 1), a resultant nonvolatile memory output 46 (see FIG. 1) can be provided to a post-processing process 48 (see FIG. 1) of the multi-level security system 42 (see FIG. 1) for post-processing, if necessary. The post-processing process 48 (see FIG. 1) provides a post-processed output product 49 (see FIG. 1). If the optical system 20 is analog, the digital signal processor 38 (see FIG. 1) provides a digital signal processor output signal 52 (see FIG. 1) to the digital-to-analog converter 54 (see FIG. 1), and the digital-to-analog converter 54 (see FIG. 1) converts the signal to analog and provides the analog output 56 (see FIG. 1). Analog output 56 (see FIG. 
1) may be used or stored in a video editing system 58 (see FIG. 1). [0009] In one embodiment of the disclosure, a dynamic image masking system 10 (see FIGS. 2, 4A) is provided for providing a filtered autonomous remote sensing image 51 (see FIGS. 2, 4A) via a dynamic image masking process 11 (see FIGS. 4A-4B). Fig. 2 is an illustration of a system block diagram of an embodiment of a dynamic image masking system 10 of the disclosure with the imaging system 12 of Fig. 1, showing a controller algorithm (GK) 60 and various image modification locations 90 in the dynamic image masking system 10. Fig. 2 shows the image modification locations 90 where one or more pixels 126 (see Fig. 4B) in one or more images 122 (see Fig. 4A) can be modified to produce a masked image 50, such as the filtered autonomous remote sensing image 51. Fig. 4A is an illustration of a system block diagram of one embodiment of the dynamic image masking system 10 of the disclosure. Fig. 4B is an illustration of a system block diagram of one embodiment of a dynamic image masking process 11 of the disclosure. Before the dynamic image masking system 10 shown in FIG. 2 and FIG. 4A is discussed in detail, reference will be made to the controller algorithm 60 (see FIGS. 2, 3, 4B). Fig. 3 is an illustration of a functional block diagram of an embodiment of the controller algorithm 60 used in one embodiment of the dynamic image masking system 10 (see Fig. 2), the method 150 (see FIG. 5A), and the method 170 (see FIG. 5B) of the disclosure. Figure 3 shows a controller function (GK) 61. As used herein, "algorithm" means a set of instructions or a list of steps for performing a task or solving a problem. The controller algorithm 60 (see FIG. 3) calculates the location of a pixel 126 (see FIG. 4B), for example, on the ground, and determines whether the pixel 126 (see FIG. 4B) is in an imaging area 118 (see Figure 4A) or not. If the pixel 126 (see Fig. 4B) is in the imaging area 118 (see Fig. 
4A), then the pixel 126 (see Fig. 4B) of an image 122 (see Fig. 4B) is captured. If the pixel 126 (see Fig. 4B) is not in the imaging area 118 (see Fig. 4A), then the pixel 126 (see Fig. 4B) is replaced by an appropriate value, where the exact value depends on the pixel replacement method used. As shown in FIG. 3, the controller algorithm 60 preferably takes a location (GPS) 108, as obtained with a global positioning system (GPS), and an attitude (UMI) 104, as obtained with an inertial measurement unit (IMU), of a remote sensing platform 14, for example, an airborne platform 14a (see Fig. 4B) in the form of an unmanned aerial vehicle 200 (see Figs. 4B, 6). Preferably, the GPS and UMI data are high fidelity data to avoid any concern with the attitude (UMI) 104 or the location (GPS) 108 or positioning, which may affect a designated area to be imaged 118a (see FIG. 4A). As further shown in FIG. 3, the controller algorithm 60 may also take information such as time 102, telemetry sensor 106, altitude, speed, flight profile, or other information from the remote sensing platform 14 (see Figure 2). As further shown in Figure 3, the controller algorithm 60 preferably applies a set of rules 92 that may contain a camera model 94, which includes parameters 95 (see Figure 4B), such as a field of view 95a (see FIG. 4B) and a focal length 95b (see FIG. 4B); masking orders 96; information relating to collection planning 98; information relating to the collection policy 100; or other suitable information to generate a masked image 50 (see FIG. 2). In this manner, the controller algorithm 60 (see Fig. 3) provides controller commands (GK) 62 (see Figs. 2, 3) to one or more controllers 63 (see Fig. 2) indicating which of the one or more pixels 126 (see Figure 4B) must be modified. 
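Under stated assumptions, the rule check performed by the controller algorithm 60 can be sketched as follows: a pixel is projected to a ground coordinate from the location (GPS) 108 and a simplified camera model 94, then tested for membership in the imaging area 118. The nadir-pointing pin-hole projection (which ignores the attitude (UMI) 104), the polygon representation, and all names are hypothetical simplifications for the example, not the disclosure's method.

```python
# Hypothetical sketch of the controller (GK) rule check: project a pixel to
# the ground, then decide whether it lies inside the designated imaging area.

def pixel_ground_point(gps_xy, altitude_m, focal_len_m, pixel_pitch_m,
                       row, col, center_rc):
    """Project pixel (row, col) of a nadir-pointing pin-hole camera to ground (x, y)."""
    scale = altitude_m / focal_len_m  # ground metres per metre on the focal plane
    dx = (col - center_rc[1]) * pixel_pitch_m * scale
    dy = (row - center_rc[0]) * pixel_pitch_m * scale
    return (gps_xy[0] + dx, gps_xy[1] + dy)

def in_imaging_area(point, polygon):
    """Ray-casting point-in-polygon test for the designated area to be imaged."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a ray cast to the right of the point.
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Toy usage: platform above ground point (5, 5) at 100 m altitude, 0.1 m focal
# length, 1 mm pixel pitch; the imaging area is a 10 m x 10 m square.
area = [(0, 0), (10, 0), (10, 10), (0, 10)]
center_pt = pixel_ground_point((5, 5), 100.0, 0.1, 0.001, 50, 50, (50, 50))
edge_pt = pixel_ground_point((5, 5), 100.0, 0.1, 0.001, 50, 60, (50, 50))
```

Here center_pt falls inside the area and would be captured, while edge_pt falls outside and would be replaced by an appropriate value, as described above.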
In one embodiment, the masking order 96 (see Figure 3) may include a dynamically updated moving masking command 96a (see Figure 4B) for fixed or mobile objects or persons that publicly broadcast their location or position. For example, this embodiment allows a first person not wanting to be photographed in public to broadcast, with a device such as a mobile phone 97 (see Figure 4B), his or her location to a second person taking pictures. The optical system 20 (see FIG. 2) of the second person, such as a camera 20a (see FIG. 2), or a sensor, would receive the position of the first person and determine if the first person can be identified in the camera frame, based on the camera model 94 (see Fig. 4B) and the camera parameters 95 (see Fig. 4B), such as the field of view 95a (see Fig. 4B), the focal length 95b (see Figure 4B), adjustments, or other suitable camera parameters 95 (see Figure 4B). If so, the optical system 20 (see Fig. 2), such as the camera 20a (see Fig. 2), or the sensor would obscure or blur the image of the first person. [0010] The dynamic image masking system 10 (see Figs. 2, 4A) is preferably an automated image collection system 148 (see Fig. 4A) which includes the controller algorithm 60 (see Figs. 2, 3, 4B), which provides controller commands 62 (see FIGS. 2, 3, 4B) to one or more controllers 63 (see FIGS. 2, 4B), which control, via the dynamic image masking process 11 (see FIGS. 4A to 4B), one or more image modification locations 90 (see Fig. 2) located in the dynamic image masking system 10 (see Fig. 2). As used herein, "dynamic image masking" means masking, erasing, blocking, overwriting, light saturation (blinding), non-collection, removal, constraining, or otherwise modifying one or more pixels 126 (see Fig. 4B) in one or more images 122 (see Fig. 4A) of the excluded area 124 (see Fig. 4A) where the one or more pixels 126 (see Fig. 4B) are undesirable, extraneous, or restricted. The dynamic image masking system 10 (see Figs. 
2, 4A) and the dynamic image masking process 11 (see Figs. 4A-4B) produce a masked image 50 (see Fig. 4A), such as a filtered autonomous remote sensing image 51 (see FIG. 4A), which is reliable and repeatable and which preferably leads to an established product using only the pixels 126 (see FIG. 4B) of interest. The one or more pixels 126 (see Fig. 4B) that undergo the dynamic image masking process 11 can result in, for example, masked pixels 126a (see Fig. 4B), erased pixels 126b (see Fig. 4B), uncollected pixels 126c (see FIG. 4B), overwritten pixels 126d (see FIG. 4B), light-saturated pixels 126e (see FIG. 4B), or other appropriately modified pixels. [0011] As shown in Figs. 2, 4A, the dynamic image masking system 10 includes the imaging system 12 associated with a remote sensing platform 14, as shown in Fig. 1. The imaging system 12 (see FIGS. 2, 4A) may comprise a two-dimensional imaging system 12a (see FIG. 4A), a three-dimensional imaging system 12b (see FIG. 4A) such as stereo imaging, or another adequate imaging system 12 (see Figure 4A). As shown in FIGS. 2, 4A, the imaging system 12 comprises an optical system 20 and an image detection system 22. [0012] As shown in FIG. 4A, the remote sensing platform 14 may comprise an airborne platform 14a, such as an unmanned aerial vehicle 200 (see FIG. 6), a ground-based platform 14b, a flatbed-based platform 14c, or a water-based platform 14d. The remote sensing platform 14 (see Figure 4A) may also include another suitable platform. As shown in Figs. 2, 4A, the dynamic image masking system 10 further comprises a multi-level security system 42 associated with the imaging system 12. The multi-level security system 42 (see Fig. 2) includes the resultant nonvolatile memory 44 (see FIG. 2) and the post-processing process 48 (see FIG. 2). The resultant nonvolatile memory 44 (see FIG. 
2) may include any suitable computer readable storage medium, such as a ROM, a RAM, a VRAM, a hard disk, a floppy disk, a compact disc (CD), a magnetic tape, a combination thereof, or another suitable computer-readable memory device. The multi-level security system 42 (see Figures 2, 4A) is preferably required to maintain the integrity of the data for the images 122 (see Figure 4B). The multi-level security system 42 (see FIGS. 2, 4A) controls access to the dynamic image masking system 10 as well as access to information about individual pixels 126 (see FIG. 4B). As shown in Figures 2, 4B, the dynamic image masking system 10 further includes one or more image modification locations 90 preferably located in the imaging system 12 and the multi-level security system 42, where modification of the one or more images occurs via the dynamic image masking process 11. One or more image modification locations 90 (see FIG. 2) may also be located outside the imaging system 12 (see Figure 2) and the multi-level security system 42 (see Figure 2). [0013] The dynamic image masking system 10 (see FIGS. 2, 4A) preferably comprises a collection planning phase 112 (see FIG. 4A), a collection phase 114 (see FIG. 4A), and a post-processing phase 116 (see FIG. 4A) for planning, collecting and post-processing one or more images 122 (see FIG. 4A) collected during an event or a collection mission. For the collection planning phase 112 (see Fig. 4A), the dynamic image masking system 10 (see Figs. 2, 4A) may preferably include a pre-established collection planning process 16 (see Figs. 2, 4A). For example, the pre-established collection planning process 16 (see Figure 4A) may include a pre-established flight plan 17 (see Figure 4A) of an airborne platform 14a (see Figure 4A), such as an unmanned aerial vehicle 200 (see Figure 6). The pre-established collection planning process 16 (see FIGS. 1, 2, 4A) preferably includes the determination of an excluded area 124 (see FIG. 
4A) not to be imaged with the imaging system 12 (see FIGS. 1, 2, 4A) prior to an event or a collection mission, for example, prior to overflight by the unmanned aerial vehicle 200 (see Figure 6) of an imaging area 118 (see Figure 4A). The pre-established collection planning process 16 (see FIGS. 1, 2, 4A) dynamically plans which areas are not to be imaged and excludes such areas from the collection plan or mission prior to the commencement of an event or collection mission. The pre-established collection planning process 16 (see Figures 2, 4A) can be conducted as a manual process or an automated process. The automated process preferably uses the controller algorithm (GK) 60 (see FIG. 2), such as the controller algorithm (GK) 60a (see FIG. 2), configured to send a controller command 62 (see FIG. 2), such as controller command 62a (see FIG. 2), to the pre-established collection planning process 16 (see Fig. 2) at an image modification location 90 (see Fig. 2), such as the image modification location 90a (see Fig. 2). The controller command 62 (see Fig. 2), such as controller command 62a (see Fig. 2), may preferably comprise in this collection planning phase 112 (see Fig. 4A) a set of rules 92 (see Figure 3) including collection planning 98 (see Figure 3) and a collection policy 100 (see Figure 3), or other appropriate rules and policies. The collection planning 98 (see Figure 3) and the collection policy 100 (see Figure 3) preferably include the implementation of the specific privacy policies and rules in force in the area, region, state, country, and/or nation of the event or collection mission. As shown in Fig. 2, the image modification location 90, such as the image modification location 90a, is located before the input to the optical system 20 of the imaging system 12. As further shown in Fig. 2, the collection planning process output 18 is provided through the collection planning process 16 and input to the optical system 20. 
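The pre-flight exclusion described above — removing areas from the collection plan before a mission begins — can be sketched as a simple filter over planned collection points. The waypoint list, the rectangular excluded areas, and the function name are assumptions made for this example only, not the disclosure's implementation.

```python
# Hypothetical sketch of the automated collection-planning step: planned
# collection points that fall inside any excluded area are dropped from
# the plan before the mission starts.

def plan_collection(waypoints, excluded_rects):
    """Keep only waypoints outside every excluded rectangle (xmin, ymin, xmax, ymax)."""
    def is_excluded(pt):
        x, y = pt
        return any(xmin <= x <= xmax and ymin <= y <= ymax
                   for xmin, ymin, xmax, ymax in excluded_rects)
    return [pt for pt in waypoints if not is_excluded(pt)]

# Toy usage: three planned collection points; the middle one sits inside
# an excluded area and is removed from the plan.
plan = plan_collection([(1, 1), (5, 5), (9, 9)], [(4, 4, 6, 6)])
```

The same filter could be driven by the collection policy 100 (for example, privacy rules in force in the mission's region) rather than a fixed rectangle list.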
For the collection phase 114 (see Fig. 4A) of the dynamic image masking system 10 (see FIGS. 2, 4A), the imaging system 12 (see FIGS. 2, 4A) is preferably used to designate an imaging area 118 (see FIG. 4A) to obtain a designated area to be imaged 118a (see Figure 4A). A plurality of reference points 120 (see Fig. 4A) may be established on a surface 118b (see Fig. 4A) of the designated area to be imaged 118a (see Fig. 4A). A plurality of specific surface areas 124a (see Fig. 4A) may be designated as an excluded area 124 (see Fig. 4A) not to be imaged with reference to the plurality of reference points 120 (see Fig. 4A). The dynamic image masking system 10 (see FIG. 4A) may further comprise a navigation system 110 (see FIG. 4A) for positioning the imaging system 12 (see FIGS. 2, 4A) in order to image the designated area to be imaged 118a (see Figure 4A). The navigation system 110 (see FIG. 4A) may include a Global Positioning System (GPS) 110a (see FIG. 4A), a radio navigation system 110b (see FIG. 4A), an optical navigation system 110c (see FIG. 4A), an inertial measurement unit (IMU) system 110d (see Fig. 4A), an inertial measurement unit (IMU) system equipped with a magnetometer 110e (see Fig. 4A), a combination thereof, or another suitable navigation system (see Figure 4A). As shown in FIGS. 1, 2, 4A, the optical system 20 may comprise a camera 20a. Preferably, the camera 20a (see Figs. 1, 2, 4A) is a digital camera 20b (see Fig. 4A). The optical system 20 (see FIGS. 1, 2, 4A) may also include other suitable advanced optics or camera devices. As mentioned above, the optical system 20 represents an optical view of the entire world. As shown in Figs. 2, 4B, the dynamic image masking system 10 further comprises an optical blindness system 64 located between the optical system 20 and the image detection system 22. As further shown in Figs. 
2, 4B, the controller algorithm (GK) 60, such as the controller algorithm (GK) 60b, is configured to send a controller command 62, such as the controller command 62b, to a controller 63, such as the optical blindness system 64, for controlling an image modification location 90, such as the image modification location 90b. The controller algorithm 60b (see Fig. 2) is configured to send the controller command 62b (see Fig. 2), either mechanically or optically, to the optical blindness system 64 (see Fig. 2), which controls the image modification location 90b (see Fig. 2) located between the optical system 20 (see Fig. 2) and the image detection system 22 (see Fig. 2). With this embodiment of the optical blindness system 64 (see FIG. 2), no extraneous pixels are processed, because the pixels 126 (see FIG. 4B) are modified before recording on the focal plane array 26 (see FIG. 2) of the image detection system 22 (see Figure 2). The collection of photons by the pixel 126 (see Figure 4B) can be inhibited, or the pixel 126 (see Figure 4B) can be saturated with light by illuminating it 100% to cause a "blindness" to occur. The optical blindness system 64 (see FIGS. 2, 4B) may be used with image modification equipment 65 (see FIG. 4B). The image modification equipment 65 (see Fig. 4B) may comprise a mechanical device 66 (see Fig. 4B), such as a mechanical shutter control device 66a (see Fig. 4B), which may be used to inhibit photon collection by a plurality of pixels 126 (see Figure 4B). Alternatively, the image modification equipment 65 (see Fig. 4B) may comprise an optical device 67 (see Fig. 4B), such as a laser optical device 67a (see Fig. 4B) or a micromirror optical device 67b (see FIG. 4B), which can be used to illuminate specific pixels 126 (see FIG. 4B) on the focal plane array 26 (see FIG. 2), resulting in blindness of the pixels 126 (see FIG. 4B). The controller command 62 (see FIG. 2), such as the controller command 62b (see FIG. 
2), may preferably comprise in this collection phase 114 (see FIG. 4A) the set of rules 92 (see Fig. 3) including the camera model 94 (see Fig. 3), the masking commands 96 (see Fig. 3), the collection planning 98, the collection policy 100, or other adequate rules and policies. The controller command 62 (see Fig. 2), such as controller command 62b (see Fig. 2), may further preferably include in this collection phase 114 (see Fig. 4A) the time 102 (see Figure 3), the attitude (UMI) 104 (see Figure 3), the telemetry sensor 106 (see Figure 3), and/or the location (GPS) 108 (see Figure 3). As shown in Fig. 1, the optical system 20 outputs raw image data 24 obtained with the optical system 20 and inputs the raw image data 24 to the image detection system 22 of the imaging system 12. As shown in FIG. 2, by the use of the controller algorithm 60, such as the controller algorithm 60b, and the optical blindness system 64, which uses an optical blindness system output 68 to control the image modification location 90, such as the image modification location 90b, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see Figure 4B) are hidden or modified. Thus, by the use of the controller algorithm 60 (see FIG. 2), such as the controller algorithm 60b (see FIG. 2), and the optical blindness system 64 (see FIG. 2), masked raw image data 24a (see Fig. 2) is input to the focal plane array 26 (see Fig. 2) of the image detection system 22 (see Fig. 2). As shown in FIGS. 2, 4A, the image detection system 22 comprises the focal plane array subsystem 22a comprising the focal plane array 26, the analog-to-digital converter (A/D) 30, the volatile temporary memory 34, the digital signal processor 38, and the digital-to-analog (D/A) converter 54. As the focal plane array subsystem 22a (see Fig. 2) can be an integrated circuit, the focal plane array subsystem 22a (see Fig. 
2) may require some degree of de-integration to interrupt a signal at a desired image modification location 90 (see Fig. 2) in the focal plane array subsystem 22a (see Fig. 2) of the image detection system 22 (see Fig. 2), such as, for example, at the image modification locations 90c, 90d, and 90e. [0014] As shown in FIG. 4A, the image detection system 22 may further comprise a radar imaging system 22b, a sonar imaging system 22c, an infrared imaging system 22d, an X-ray imaging system 22e, a light detection and ranging (LIDAR) system 22f, or other suitable image detection system 22. [0015] As shown in FIG. 2, the controller algorithm 60, such as in the form of the controller algorithm 60c, is configured to send a controller command 62, such as in the form of the controller command 62c, to a controller 63, such as a pixel controller 69. The pixel controller 69 (see Fig. 2) uses a pixel controller output 70 to control the image modification location 90, such as the image modification location 90c, through the dynamic image masking process 11 (see FIGS. 4A-4B), by overwriting one or more pixels 126 (see Fig. 4B) on the focal plane array 26 (see Fig. 2) with either zero saturation 140 (see Fig. 4A) or 100 percent saturation 142 (see Fig. 4A). [0016] With this embodiment of the pixel controller 69 (see FIG. 2), the pixel controller 69 (see FIG. 2) supplies the focal plane array 26 (see FIG. 2) and essentially overwrites the pixels 126 (see FIG. 4B) on the focal plane array 26 (see FIG. 2) with a saturation of either 0 (zero) or 100% (one hundred percent) (this level can correspond to a value of 255 for an 8-bit system). As shown in FIG. 1, the focal plane array 26 provides the focal plane array output 28 and inputs the focal plane array output 28 to the analog-to-digital converter 30. As shown in FIG.
2, with the use of the controller algorithm 60, such as the controller algorithm 60c, and the pixel controller 69, which uses the pixel controller output 70 to control the image modification location 90, such as the image modification location 90c, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see FIG. 4B) are masked or overwritten. Thus, by the use of the controller algorithm 60 (see Fig. 2), such as the controller algorithm 60c (see Fig. 2), and the pixel controller 69 (see Fig. 2), a masked focal plane array output 28a (see Fig. 2) is input to the analog-to-digital converter 30. As further shown in Fig. 2, the analog-to-digital converter 30 receives the masked focal plane array output 28a (see Fig. 2), which is preferably in the form of masked raw image data 24a, from the focal plane array 26. The analog-to-digital converter 30 (see Fig. 2) converts the masked raw image data 24a into digital signals 37 (see Figure 4B). As shown in Fig. 2, the controller algorithm 60, such as in the form of the controller algorithm 60d, is configured to send a controller command 62, such as in the form of the controller command 62d, to a controller 63, such as a scan controller 72. The scan controller 72 (see Fig. 2) uses a scan controller output 74 to control the image modification location 90, such as the image modification location 90d, through the dynamic image masking process 11 (see FIGS. 4A-4B). The image modification location 90d (see Fig. 2) is located between the analog-to-digital converter 30 (see Fig. 2) and the volatile temporary memory 34 (see Fig. 2). The one or more pixels 126 (see Fig. 4B) are preferably modified at the image modification location 90d by setting a digitized value 146 (see Fig. 4B) for the one or more pixels 126 (see Fig. 4B) to either a minimum value 146a (see Figure 4B) or a maximum value 146b (see Figure 4B). With this embodiment of the scanning controller 72 (see Fig. 2), the scanning controller 72 (see Fig.
2) controls scanning after the analog-to-digital converter 30 (see Fig. 2) by setting the digitized value 146 (see Fig. 4B) either low (the minimum value 146a (see Fig. 4B)) or high (the maximum value 146b (see Fig. 4B)). In this way, the signals for some pixels 126 (see Fig. 4B) are essentially lowered (low set value) or raised (high set value). This embodiment can be used with custom interface electronics, for example, a wired-OR function 144 (see Figure 4B), which is a hardware implementation of a Boolean operator. The wired-OR function 144 (see FIG. 4B) electrically performs the Boolean logic operation of an OR gate, using a biasing resistor and a diode per input. As shown in FIG. 1, the analog-to-digital converter 30 provides an analog-to-digital converter output 32 and outputs the analog-to-digital converter output 32 to the volatile temporary memory 34. As shown in Fig. 2, by the use of the controller algorithm 60, such as the controller algorithm 60d, and the scan controller 72, which uses the scan controller output 74 to control the image modification location 90, such as the image modification location 90d, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see FIG. 4B) are masked or overwritten. Thus, by the use of the controller algorithm 60 (see FIG. 2), such as the controller algorithm 60d (see FIG. 2), and the scanning controller 72 (see FIG. 2), a masked analog-to-digital converter output 32a (see FIG. 2) is input to the volatile temporary memory 34 (see FIG. 2). [0017] As further shown in Fig. 2, the volatile temporary memory 34 receives the masked analog-to-digital converter output 32a, which is preferably in the form of digital signals 37 (see Fig. 4B), from the analog-to-digital converter 30. The volatile temporary memory 34 temporarily stores the digital signals 37 (see Fig. 4B) from the analog-to-digital converter 30. As shown in Fig.
2, the controller algorithm 60, such as in the form of the controller algorithm 60e, is configured to send a controller command 62, such as in the form of the controller command 62e, to a controller 63, such as a digital flow controller 76. The digital flow controller 76 (see FIG. 2) uses a digital flow controller output 78 to control the image modification location 90, such as the image modification location 90e, through the dynamic image masking process 11 (see FIGS. 4A-4B). The image modification location 90e (see Fig. 2) is located between the volatile temporary memory 34 (see Fig. 2) and the digital signal processor 38 (see Fig. 2). The one or more pixels 126 (see FIG. 4B) are preferably modified at the image modification location 90e by modifying a single image 122 (see FIG. 4A) at a time and hiding the one or more pixels 126 (see Figure 4B) in the single image 122 (see Figure 4A). With this embodiment of the digital flow controller 76 (see Fig. 2), the volatile temporary memory 34 (see Fig. 2) outputs a single image 122 (see Fig. 4A) at a time to the digital signal processor 38 (see Figure 2). This occurs because the volatile temporary memory 34 (see Figure 2) is overwritten for every single image 122 (see Figure 4A) that is processed. As shown in FIG. 1, the volatile temporary memory 34 provides the volatile temporary memory output 36 and inputs the volatile temporary memory output 36 to the digital signal processor 38. As shown in Fig. 2, by the use of the controller algorithm 60, such as the controller algorithm 60e, and the digital flow controller 76, which uses the digital flow controller output 78 to control the image modification location 90, such as the image modification location 90e, through the dynamic image masking process 11 (see Figs. 4A-4B), the one or more pixels 126 (see Fig. 4B) are masked or overwritten. Thus, by the use of the controller algorithm 60 (see Fig. 2), such as the controller algorithm 60e (see Fig. 2), and the digital flow controller 76 (see Fig.
2), a masked volatile temporary memory output 36a (see Fig. 2) is input to the digital signal processor 38. The digital signal processor 38 (see Fig. 2) receives the digital signals 37 (see Fig. 4B) from the volatile temporary memory 34 (see FIG. 2) and processes the digital signals 37 (see FIG. 4B) into a readable image format 39 (see FIG. 4B). When the imaging system 12 (see Fig. 2) uses an analog output, a digital-to-analog converter 54 (see Fig. 2) receives readable digital signals from the digital signal processor 38 (see Fig. 2) and converts the readable digital signals into analog signals. As shown in Fig. 2, the controller algorithm 60, such as in the form of the controller algorithm 60f, is configured to send a controller command 62, such as in the form of the controller command 62f, to a controller 63, such as a control memory controller 80. The control memory controller 80 (see Fig. 2) uses a control memory controller output 82 to control the image modification location 90, such as the image modification location 90f, through the dynamic image masking process 11 (see Figs. 4A-4B). The image modification location 90f (see Fig. 2) is located at a digital signal processor output 40 (see Figs. 1, 2) of the focal plane array subsystem 22a (see Fig. 2) of the imaging system 12 (see FIG. 2), and before entering the resulting non-volatile memory 44 (see FIG. 2) of the multi-level security system 42 (see FIG. 2). At the image modification location 90f, the one or more pixels 126 (see Fig. 4B) may be masked, so that they are not written to the resulting non-volatile memory 44 (see Fig. 2). With this embodiment of the control memory controller 80 (see FIG. 2), the control memory controller 80 (see FIG. 2) modifies the image 122 (see FIG. 4A) at the output of the focal plane array subsystem 22a (see FIG. 2). The one or more pixels 126 (see FIG. 4B) that need to be constrained or eliminated (masked) are determined by the controller algorithm 60 (see FIG.
2), such as the controller algorithm 60f (see Figure 2), and this pixel information is then correlated to a location in the image 122 (see Figure 4A). As a result, unwanted pixels are blocked and are not written to the resulting non-volatile memory 44 (see FIG. 2). As shown in FIG. 1, the digital signal processor 38 provides the digital signal processor output 40 and outputs the digital signal processor output 40 to the resulting non-volatile memory 44 of the multi-level security system 42. As shown in Fig. 2, by the use of the controller algorithm 60, such as the controller algorithm 60f, and the control memory controller 80, which uses the control memory controller output 82 to control the image modification location 90, such as the image modification location 90f, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see FIG. 4B) are masked or modified by being blocked and not written to the resulting non-volatile memory 44. Thus, by the use of the controller algorithm 60 (see Fig. 2), such as the controller algorithm 60f (see Fig. 2), and the control memory controller 80 (see Fig. 2), a masked digital signal processor output 40a (see FIG. 2) is provided to the resulting non-volatile memory 44 (see FIG. 2) of the multi-level security system 42 (see FIG. 2). As shown in Fig. 2, the controller algorithm 60, such as in the form of the controller algorithm 60g, is configured to send a controller command 62, such as in the form of the controller command 62g, to control the image modification location 90, such as the image modification location 90g, through the dynamic image masking process 11 (see Figs. 4A-4B). The image modification location 90g (see Fig. 2) is located in the multi-level security system 42 (see Fig. 2) between the resulting non-volatile memory 44 (see Fig. 2) and the post-processing process 48 (see Figure 2). At the image modification location 90g, the one or more pixels 126 (see FIG. 4B) can be modified by overwriting the one or more pixels 126 (see FIG.
4B) with either zero saturation 140 (see FIG. 4B) or one hundred percent saturation 142 (see Figure 4B). With this embodiment, the image 122 (see Fig. 4A) is modified after it leaves the resulting non-volatile memory 44 (see Fig. 2) but before it is post-processed in the post-processing process 48 (see Figure 2). The unwanted pixels, as determined by the controller algorithm 60 (see FIG. 2), such as the controller algorithm 60g (see FIG. 2), are blocked by overwriting their digitized value 146 (see FIG. 4B) with a known entity, either a 0 (zero) or a value that represents 100% (one hundred percent) of the allowable value of the pixel 126 (see FIG. 4B). As shown in FIG. 1, the resulting non-volatile memory 44 provides a resulting non-volatile memory output 46 and outputs the resulting non-volatile memory output 46 to the post-processing process 48. As shown in Fig. 2, by the use of the controller algorithm 60, such as the controller algorithm 60g, to control the image modification location 90, such as the image modification location 90g, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see FIG. 4B) are masked or modified. Thus, by the use of the controller algorithm 60 (see FIG. 2), such as the controller algorithm 60g (see FIG. 2), a masked resulting non-volatile memory output 46a (see FIG. 2) is output to the post-processing process 48 (see Figure 2). As shown in Fig. 2, the controller algorithm 60, such as in the form of the controller algorithm 60h, is configured to send a controller command 62, such as in the form of a controller command 62h, to control the image modification location 90, such as the image modification location 90h, through the dynamic image masking process 11 (see Figs. 4A-4B). The image modification location 90h (see Fig. 2) is located in the multi-level security system 42 (see Fig. 2) at the post-processing process 48 (see Fig. 2). At the image modification location 90h, the one or more pixels 126 (see FIG.
4B) can be modified by editing or ignoring the one or more pixels 126 (see FIG. 4B) representing the excluded area 124 (see Figure 4B) of the designated area to be imaged 118a (see Figure 4B). With this embodiment, which corresponds to the post-processing phase 116 (see Fig. 4A), the one or more pixels 126 (see Fig. 4B) are masked at the post-processing process 48 (see Fig. 2). The image 122 (see Figure 4A) is essentially modified by simply editing or ignoring the one or more pixels 126 (see Figure 4B) that represent portions of an image 122 (see Figure 4A) that are undesirable, such as the excluded area 124 (see Figure 4B). As shown in Fig. 1, the post-processing process 48 provides a post-processed output 49 of the multi-level security system 42. As shown in Fig. 2, by the use of the controller algorithm 60, such as the controller algorithm 60h, to control the image modification location 90, such as the image modification location 90h, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see FIG. 4B) are masked or modified. Thus, by the use of the controller algorithm 60 (see FIG. 2), such as the controller algorithm 60h (see FIG. 2), the post-processing process 48 provides a masked post-processed output 49a of the multi-level security system 42 to obtain a masked image 50 (see FIG. 2), such as the filtered autonomous remote sensing image 51 (see FIG. 2). As shown in Fig. 2, the controller algorithm 60, such as in the form of the controller algorithm 60i, is configured to send a controller command 62, such as in the form of a controller command 62i, to a controller 63, such as an analog signal controller 84. The analog signal controller 84 (see Fig. 2) uses an analog signal controller output 86 to control the image modification location 90, such as the image modification location 90i, through the dynamic image masking process 11 (see FIGS. 4A-4B). The image modification location 90i (see FIG. 2) is located at an analog output 56 (see FIG.
2) of the digital-to-analog converter 54 (see FIG. 2) of the focal plane array subsystem 22a (see Fig. 2), and before entering a video editing system 58 (see Fig. 2) outside the focal plane array subsystem 22a (see Fig. 2). The one or more pixels 126 (see Fig. 4B) are preferably modified at the image modification location 90i by masking the one or more pixels 126 (see Fig. 4B) so that they are not written to the video editing system 58 (see Figure 2). [0018] As shown in FIG. 1, the digital signal processor 38 provides the digital signal processor analog output 52 to the digital-to-analog converter 54, and the digital-to-analog converter 54 provides the analog output 56 of the focal plane array subsystem 22a and inputs the analog output 56 to the video editing system 58. As shown in Fig. 2, by the use of the controller algorithm 60, such as the controller algorithm 60i, and the analog signal controller 84, which uses the analog signal controller output 86 to control the image modification location 90, such as the image modification location 90i, through the dynamic image masking process 11 (see FIGS. 4A-4B), the one or more pixels 126 (see FIG. 4B) are masked or modified. Thus, by the use of the controller algorithm 60 (see Fig. 2), such as the controller algorithm 60i (see Fig. 2), and the analog signal controller 84 (see Fig. 2), a masked analog output 56a (see Fig. 2) is input to the video editing system 58 (see Fig. 2). As shown in FIG. 4B, the dynamic image masking system 10 further comprises a computer system 130 associated with the imaging system 12. The computer system 130 (see FIG. 4B) comprises the controller algorithm 60 (see FIGS. 2, 4B) configured to send controller commands 62 (see FIGS. 2, 4B) to one or more controllers 63 (see FIGS. 2, 4B) which control the one or more image modification locations 90 (see Figures 2, 4B) with the dynamic image masking process 11 (see Figures 2, 4B). [0019] As shown in FIG.
4B, the computer system 130 preferably includes a computer 132 and one or more of software 134, firmware 136, and hardware 138. The controller algorithms 60 and the controllers 63 may preferably be a combination of hardware 138 and firmware 136, or a combination of hardware 138 and software 134. [0020] The software 134 (see FIG. 4B) or the firmware 136 (see FIG. 4B) can implement the controller algorithm 60 (see FIG. 3) designed to be used together with the computer 132 (see FIG. 4B) of the computer system 130 or other hardware 138 (see FIG. 4B) of the computer system 130 (see FIG. 4B). [0021] In another embodiment of the disclosure, there is provided a method 150 (see FIG. 5A) of providing a filtered autonomous remote sensing image 51 (see FIG. 4A) via a dynamic image masking process 11 (see Figure 4B). Fig. 5A is an illustration of a flowchart of an embodiment of the method 150 of the disclosure. [0022] As shown in FIG. 5A, the method 150 includes a step 152 of equipping a remote sensing platform 14 (see FIGS. 2, 4A) with an imaging system 12 (see FIGS. 2, 4A). The step 152 of equipping the remote sensing platform 14 (see FIGS. 2, 4A) with the imaging system 12 (see FIGS. 2, 4A) comprises equipping the remote sensing platform 14 (see Figs. 2, 4A) with the imaging system 12 (see Figs. 2, 4A) including an optical system 20 (see Fig. 2) including a digital camera 20a (see Fig. 2), and an image detection system 22 (see FIGS. 2, 4A) including a focal plane array subsystem 22a (see FIG. 4A), a radar imaging system 22b (see FIG. 4A), a sonar imaging system 22c (see Fig. 4A), an infrared imaging system 22d (see Fig. 4A), an X-ray imaging system 22e (see Fig. 4A), or a light detection and ranging (LIDAR) system 22f (see Fig. 4A). [0023] The step 152 of equipping the remote sensing platform 14 (see FIGS. 2, 4A) with the imaging system 12 (see FIGS. 2, 4A) further comprises equipping the remote sensing platform 14 (see FIGS.
2, 4A) comprising an airborne platform 14a (see FIG. 4A), a ground-based platform 14b (see FIG. 4A), a space-based platform 14c (see Figure 4A), or a water-based platform 14d (see Figure 4A). As shown in FIG. 5A, the method 150 includes a step 154 of designating an imaging area 118 (see FIG. 4A) to obtain a designated area to be imaged 118a (see FIG. 4A). As shown in Fig. 5A, the method 150 further comprises a step 156 of setting a plurality of reference points 120 (see Fig. 4A) on a surface 118b (see Fig. 4A) of the designated area to be imaged 118a (see Figure 4A). As shown in Fig. 5A, the method 150 further comprises a step 158 of designating a plurality of specific surface areas 124a (see Fig. 4A) as an excluded area 124 (see Fig. 4A) not to be imaged, with reference to the plurality of reference points 120 (see Figure 4A). As shown in Fig. 5A, the method 150 further comprises a step 160 of controlling a pre-established collection planning process 16 (see Figs. 2, 4A) covering the designated area to be imaged 118a (see Fig. 4A). As shown in FIG. 5A, the method 150 includes a step 162 of using a navigation system 110 (see FIG. 4A) comprising a Global Positioning System (GPS) 110a (see Fig. 4A), a radio navigation system 110b (see Fig. 4A), an optical navigation system 110c (see Fig. 4A), an inertial measurement unit (IMU) system 110d (see Fig. 4A), an inertial measurement unit (IMU) system equipped with a magnetometer 110e (see Fig. 4A), or one of their combinations, for positioning the imaging system 12 (see Figs. 2, 4A) for imaging the designated area to be imaged 118a (see Figure 4A). [0024] As shown in Fig. 5A, the method 150 further comprises a step 164 of using the imaging system 12 (see Figs. 2, 4A) to image the designated area to be imaged 118a (see Fig. 4A), which is covered through the pre-established collection planning process 16 (see Figures 2, 4A). As shown in FIG. 5A, the method 150 comprises a step 166 of dynamic nullification of one or more pixels 126 (see FIG.
4B) in one or more images 122 (see FIG. 4A) of the excluded area 124 (see Figure 4A). The step 166 of dynamic nullification of one or more pixels 126 (see Fig. 4B) from one or more images 122 (see Fig. 4A) of the excluded area 124 (see Fig. 4A) includes the modification of one or more captured images 124b (see Figure 4B) of the excluded area 124 (see Figure 4A), making them indecipherable. In one embodiment, the modification of the one or more captured images 124b (see FIG. 4A) of the excluded area 124 (see FIG. 4A) is preferably performed in real time during imaging of the designated area to be imaged 118a (see Figure 4A). In another embodiment, the modification of the one or more captured images 124b (see FIG. 4A) of the excluded area 124 (see FIG. 4A) is performed after a global imaging of the designated area to be imaged 118a (see Fig. 4A) is completed, and before the filtered autonomous remote sensing image 51 (see Figs. 2, 4A) is obtained via the dynamic image masking process 11 (see Fig. 4B) of the designated area to be imaged 118a (see Figure 4A). As shown in FIG. 5A, the method 150 comprises a step 168 of obtaining a filtered autonomous remote sensing image 51 (see FIGS. 2, 4A) via the dynamic image masking process 11 (see Figure 4B) of the designated area to be imaged 118a (see Figure 4A). In another embodiment of the disclosure, there is provided a method 170 (see FIG. 5B) for providing a filtered autonomous remote sensing image 51 (see FIG. 4A) via a dynamic image masking process 11 (see Figure 4B). Figure 5B is an illustration of a flowchart of another embodiment of the method 170 of the disclosure. As shown in FIG. 5B, the method 170 includes a step 172 of equipping an unmanned aerial vehicle (VASP) 200 (see FIG. 6) with an imaging system 12 (see FIGS. 2, 4A). As shown in FIG. 5B, the method 170 further includes a step 174 of designating an imaging area 118 (see FIG. 4A) to obtain a designated area to be imaged 118a (see FIG. 4A). As shown in Fig.
5B, the method 170 further comprises a step 176 of setting a plurality of reference points 120 (see Fig. 4A) on a surface 118b (see Fig. 4A) of the designated area to be imaged 118a (see Figure 4A). As shown in Fig. 5B, the method 170 further comprises a step 178 of designating a plurality of specific surface areas 124a (see Fig. 4A) as an excluded area 124 (see Fig. 4A) not to be imaged, with reference to the plurality of reference points 120 (see Figure 4A). As shown in FIG. 5B, the method 170 further comprises a step 180 of controlling a pre-established flight plan 17 (see FIG. 4A) of the VASP 200 (see FIG. 6) covering the designated area to be imaged 118a (see Figure 4A). As shown in FIG. 5B, the method 170 further comprises a step 182 of using a navigation system 110 (see FIG. 4A) comprising a Global Positioning System (GPS) 110a (see FIG. 4A), a radio navigation system 110b (see FIG. 4A), an optical navigation system 110c (see FIG. 4A), an inertial measurement unit (IMU) system 110d (see FIG. 4A), an inertial measurement unit (IMU) system equipped with a magnetometer 110e (see FIG. 4A), or one of their combinations, for positioning the imaging system 12 (see FIGS. 2, 4A) for imaging the designated area to be imaged 118a (see Figure 4A). [0025] As shown in FIG. 5B, the method 170 further comprises a step 184 of overflight by the VASP 200 (see FIG. 6) of the designated area to be imaged 118a (see FIG. 4A), and of using the imaging system 12 (see Figs. 2, 4A) to image the designated area to be imaged 118a (see Fig. 4A), which is covered by the pre-established flight plan 17 (see Fig. 4A) of the VASP 200 (see Fig. 6). As shown in FIG. 5B, the method 170 further comprises a step 186 of dynamically nullifying one or more pixels 126 (see FIG. 4B) in one or more images 122 (see FIG. 4A) of the excluded area 124 (see Figure 4A). The step 186 of dynamically nullifying one or more pixels 126 (see FIG. 4B) in the one or more images 122 (see FIG. 4A) of the excluded area 124 (see FIG.
4A) comprises guiding the pre-established flight plan 17 (see Figure 4A) of the VASP 200 (see Figure 6) to avoid flying over the excluded area 124 (see Figure 4A). The step 186 of dynamically nullifying one or more pixels 126 (see FIG. 4B) in the one or more images 122 (see FIG. 4A) of the excluded area 124 (see FIG. 4A) further comprises a real-time dynamic cancellation of imaging by the image detection system 22 (see Figs. 2, 4A) when the VASP 200 (see Fig. 6) flies over the excluded area 124 (see Fig. 4A). The step 186 of dynamically nullifying one or more pixels 126 (see Fig. 4B) in one or more images 122 (see Fig. 4A) of the excluded area 124 (see Fig. 4A) further comprises the modification of one or more captured images 124b (see Figure 4B) of the excluded area 124 (see Figure 4A), making them indecipherable. In one embodiment, the modification of the one or more captured images 124b (see FIG. 4A) of the excluded area 124 (see FIG. 4A) is preferably performed in real time during imaging of the designated area to be imaged 118a (see Figure 4A). In another embodiment, the modification of the one or more captured images 124b (see FIG. 4A) of the excluded area 124 (see FIG. 4A) is performed after a global imaging of the designated area to be imaged 118a (see Fig. 4A) is completed, and before the filtered autonomous remote sensing image 51 (see Figs. 2, 4A) is obtained via the dynamic image masking process 11 (see Fig. 4B) of the designated area to be imaged 118a (see Figure 4A). As shown in FIG. 5B, the method 170 further comprises a step 188 of obtaining a filtered autonomous remote sensing image 51 (see FIGS. 2, 4A) via the dynamic image masking process 11 (see Figure 5B) of the designated area to be imaged 118a (see Figure 4A). FIG.
6 is an illustration of a schematic representation of an embodiment of a remote sensing platform 14, such as the airborne platform 14a, in the form of an unmanned aerial vehicle (VASP) 200, which can be used in one embodiment of the dynamic image masking system 10, the method 150 (see Fig. 5A), and the method 170 (see Fig. 5B) of the disclosure. As shown in FIG. 6, the remote sensing platform 14, such as the airborne platform 14a in the form of the VASP 200, includes the dynamic image masking system 10. As further shown in Fig. 6, the VASP 200 comprises a nose 202, a fuselage 204, wings 206, and a stabilizer 208. Figure 7 is an illustration of a flow chart of an embodiment of an aircraft manufacturing and maintenance method 300. FIG. 8 is an illustration of a functional block diagram of an embodiment of an aircraft 320. Referring to FIGS. 7 to 8, embodiments of the disclosure can be described in the context of the aircraft manufacturing and maintenance method 300, as shown in FIG. 7, and the aircraft 320, as shown in FIG. 8. During pre-production, the exemplary aircraft manufacturing and maintenance method 300 (see FIG. 7) may include specification and design 302 (see Figure 7) of the aircraft 320 (see Figure 8) and material procurement 304 (see Figure 7). During manufacture, component and subassembly fabrication 306 (see FIG. 7) and system integration 308 (see FIG. 7) of the aircraft 320 (see FIG. 8) take place. After that, the aircraft 320 (see FIG. 8) can go through certification and delivery 310 (see FIG. 7) to be put into service 312 (see FIG. 7). While in service 312 (see Figure 7) by a customer, the aircraft 320 (see Figure 8) can be scheduled for periodic maintenance and servicing 314 (see Figure 7), which may also include modification, reconfiguration, refurbishment, and other appropriate services. Each of the processes of the aircraft manufacturing and maintenance method 300 (see FIG.
7) may be performed or carried out by a system integrator, a third party, and/or an operator (e.g., a customer). For the purposes of this description, a system integrator may include, without limitation, any number of aircraft manufacturers and major subcontractors; a third party may include, without limitation, any number of vendors, subcontractors, and suppliers; and an operator may include an airline, a leasing company, a military entity, a service organization, and other suitable operators. As shown in FIG. 8, the aircraft 320 produced by the exemplary aircraft manufacturing and maintenance method 300 may include an airframe 322 with a plurality of systems 324 and an interior 326. As further shown in Figure 8, examples of the systems 324 may include one or more of a propulsion system 328, an electrical system 330, a hydraulic system 332, and an air conditioning system 334. Any number of other systems may be included. Although an aerospace example is shown, the principles of the disclosure can be applied to other industries, such as the automotive industry. Methods and systems embodied herein may be employed during one or more stages of the aircraft manufacturing and maintenance method 300 (see Figure 7). For example, components or subassemblies corresponding to the manufacture of components and subassemblies 306 (see FIG. 7) may be manufactured similarly to components or subassemblies produced while the aircraft 320 (see FIG. 8) is in service 312 (see Figure 7). Also, one or more embodiments of apparatuses, embodiments of methods, or one of their combinations, may be used during the manufacture of components and subassemblies 306 (see FIG. 7) and the system integration 308 (see Fig. 7), for example, by substantially expediting assembly of the aircraft 320 (see Fig. 8) or reducing its cost. Similarly, one or more embodiments of apparatuses, embodiments of methods, or one of their combinations, may be used while the aircraft 320 (see FIG.
8) is in service 312 (see Figure 7), for example and without limitation, for maintenance and servicing 314 (see Figure 7). Disclosed embodiments of the dynamic image masking system 10 (see FIGS. 2, 4A-4B), the method 150 (see FIG. 5A), and the method 170 (see FIG. 5B) provide numerous advantages over known systems and methods, including imaging only useful and desired data, and not imaging areas or data that are restricted, out of bounds, or out of context for remote sensing platform missions, such as airborne platform missions. This kind of "guaranteed shutter control" addresses potentially intrusive privacy issues and ensures that the dynamic image masking system 10 (see FIGS. 2, 4A-4B) is not overloaded with unnecessary data, such as data collected on non-client areas. In addition, disclosed embodiments of the dynamic image masking system 10 (see Figures 2, 4A-4B), the method 150 (see Figure 5A), and the method 170 (see Figure 5B) provide well-defined collection areas for imaging collection, and provide stand-alone operations, which are typically required for aerial remote sensing imagery collection in the entire precision agriculture market, such as overflight of cropland to determine the health and vigor of plants. Further, disclosed embodiments of the dynamic image masking system 10 (see FIGS. 2, 4A-4B), the method 150 (see FIG. 5A), and the method 170 (see FIG. 5B) integrate the imaging system 12 (see FIGS. 2, 4A) and the automatic piloting of a remote sensing platform 14, such as an unmanned aerial vehicle (VASP) 200 (see FIG. 6), and can realize flight control and shutter operations for multiple VASPs 200 simultaneously. [0026] In addition, disclosed embodiments of the dynamic image masking system 10 (see FIGS. 2, 4A-4B), the method 150 (see FIG. 5A), and the method 170 (see FIG. 5B) provide a reliable and repeatable masked image 50 (see FIGS. 2, 4A), which preferably leads to a fixed product using only the pixels 126 (see FIG. 4B) of interest. The pixels 126 (see FIG.
4B) may be uncollected, erased, overwritten, saturated with light, or otherwise modified, so as to render the pixels 126 (see FIG. 4B) unusable in the product generation process. This can be done anywhere in the product generation process, ranging from being "cleared" in the collection planning phase 112 (see Figure 4A), to being overwritten in the collection phase 114 (see Figure 4A), to being post-processed in the post-processing phase 116 (see Figure 4A) after the one or more images 122 (see Figure 4A) are collected. [0027] Further, the disclosure includes embodiments according to the following clauses: Clause 1. Dynamic image masking system (10) for providing a filtered autonomous remote sensing image (51) through a dynamic image masking process (11), the system comprising: a remote sensing platform (14); an imaging system (12) associated with the remote sensing platform (14), the imaging system (12) comprising: an optical system (20); an image detection system (22); a multi-level security system (42) associated with the imaging system (12); one or more image modification locations (90) located in the imaging system (12) and the multi-level security system (42), wherein a modification of the one or more images occurs via the dynamic image masking process (11); and a computer system (130) associated with the imaging system (12), the computer system (130) comprising a controller algorithm (60) configured to send controller commands (62) to one or more controllers (63) that control the one or more image modification locations (90) through the dynamic image masking process (11). Clause 2.
Dynamic image masking system (10) according to clause 1, further comprising a navigation system (110) comprising a global positioning system (GPS) (110a), a radio navigation system (110b), an optical navigation system (110c), an inertial measurement unit (IMU) system (110d), an inertial measurement unit (IMU) system equipped with a magnetometer (110e), or a combination thereof, for positioning the imaging system (12) to image a designated area to be imaged (118a). Clause 3. Dynamic image masking system (10) according to clause 1 or 2, wherein the remote sensing platform (14) comprises an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d). Clause 4. Dynamic image masking system (10) according to clauses 1, 2 or 3, wherein the optical system (20) comprises a camera (20a), including a digital camera (20b), and wherein the image detection system (22) comprises a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an X-ray imaging system (22e), or a light-wave detection and telemetry system (LIDAR) (22f). Clause 5. Dynamic image masking system (10) according to clauses 1, 2, 3 or 4, wherein the controller algorithm (60) is further configured to send a controller command (62) to a pre-established collection planning process (16), at an image modification location (90) located at the pre-established collection planning process (16) before input to the imaging system (12), determining an excluded area (124) not to be imaged by the imaging system (12). Clause 6.
Dynamic image masking system (10) according to clauses 1, 2, 3, 4 or 5, wherein the controller algorithm (60) is configured to send a controller command (62), either mechanically or optically, to an optical blinding system (64) which controls an image modification location (90) located between the optical system (20) and the image detection system (22), the optical blinding system (64) comprising a mechanical shutter control device (66a) for inhibiting photon collection by one or more pixels (126), or comprising a laser optical device (67a) and micromirrors (67b) for illuminating one or more pixels (126), blinding the one or more pixels (126). Clause 7. Dynamic image masking system (10) according to clauses 1, 2, 3, 4, 5 or 6, wherein the image detection system (22) comprises a focal plane array subsystem (22a) comprising: a focal plane array (26) which reads raw image data (24) from the optical system (20); an analog-to-digital converter (30) which receives the raw image data (24) from the focal plane array (26) and converts the raw image data (24) from analog signals into digital signals; a volatile temporary memory (34) which receives the digital signals (37) from the analog-to-digital converter (30) and temporarily stores the digital signals (37); a digital signal processor (38) which receives the digital signals (37) from the volatile temporary memory (34) and processes the digital signals (37) into a readable image format (39); and, when the imaging system (12) uses an analog output, a digital-to-analog converter (54) which receives readable digital signals from the digital signal processor (38) and converts the readable digital signals to analog signals. Clause 8.
Dynamic image masking system (10) according to clause 7, wherein the controller algorithm (60) is configured to send a controller command (62) to a pixel controller that controls an image modification location (90) in the focal plane array (26), overwriting one or more pixels on the focal plane array (26) with either zero saturation (140) or one hundred percent saturation (142). Clause 9. Dynamic image masking system (10) according to clause 7, wherein the controller algorithm (60) is configured to send a controller command (62) to a scan controller (72) which controls an image modification location (90) located between the analog-to-digital converter (30) and the volatile temporary memory (34), setting a digitized value (146) for one or more pixels (126) to a minimum value (146a) or a maximum value (146b). Clause 10. Dynamic image masking system (10) according to clause 7, wherein the controller algorithm (60) is configured to send a controller command (62) to a digital flow controller (73) that controls an image modification location (90) located between the volatile temporary memory (34) and the digital signal processor (38), modifying a single image (122) at a time and masking one or more pixels (126) in the single image (122). Clause 11. Dynamic image masking system (10) according to clause 7, wherein the controller algorithm (60) is configured to send a controller command (62) to a memory controller (80) which controls an image modification location (90) located at a digital signal processor output (40) of the focal plane array subsystem (22a) and before input into a resulting non-volatile memory (44) of the multi-level security system (42), masking one or more pixels (126) so that they are not written to the resulting non-volatile memory (44). Clause 12.
Dynamic image masking system (10) according to clause 7, wherein the controller algorithm (60) is configured to send a controller command (62) to an analog signal controller (84) which controls an image modification location (90) located at a digital-to-analog converter output (56) of the focal plane array subsystem (22a) and before input into a video editing system (58), masking one or more pixels (126) so that they are not written to the video editing system (58). Clause 13. Dynamic image masking system (10) according to clauses 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 or 12, wherein the controller algorithm (60) is configured to send a controller command (62) that controls an image modification location (90) located in the multi-level security system (42) between a resulting non-volatile memory (44) and a post-processing process (48), overwriting one or more pixels (126) with either zero saturation (140) or one hundred percent saturation (142). Clause 14. Dynamic image masking system (10) according to clauses 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 or 13, wherein the controller algorithm (60) is configured to send a controller command (62) which controls an image modification location (90) at a post-processing process (48) of the multi-level security system (42), editing or ignoring one or more pixels (126) representing an excluded area (124) of a designated area to be imaged (118a). [0028] Clause 15.
A method of providing a filtered autonomous remote sensing image (51) through a dynamic image masking process (11), the method comprising the steps of: equipping a remote sensing platform (14) with an imaging system (12); designating an imaging area (118) to obtain a designated area to be imaged (118a); establishing a plurality of reference points (120) on a surface (118b) of the designated area to be imaged (118a); designating a plurality of specific surface areas (124a) as an excluded area (124) not to be imaged with reference to the plurality of reference points (120); monitoring a pre-established collection planning process (16) covering the designated area to be imaged (118a); using a navigation system (110) comprising a global positioning system (GPS) (110a), a radio navigation system (110b), an optical navigation system (110c), an inertial measurement unit (IMU) system (110d), an inertial measurement unit (IMU) system equipped with a magnetometer (110e), or a combination thereof, for positioning the imaging system (12) to image the designated area to be imaged (118a); using the imaging system (12) to image the designated area to be imaged (118a) that is covered through the pre-established collection planning process (16); dynamically nullifying one or more pixels (126) in one or more images (122) of the excluded area (124); and obtaining a filtered autonomous remote sensing image (51) through the dynamic image masking process (11) of the designated area to be imaged (118a). Clause 16. A method according to clause 15, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping the remote sensing platform (14
) with the imaging system (12) comprising an optical system (20) including a digital camera (20a), and an image detection system (22) including a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an X-ray imaging system (22e), or a light-wave detection and telemetry system (LIDAR) (22f). Clause 17. A method according to clause 15 or 16, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping the remote sensing platform (14) comprising an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d). Clause 18. A method according to clauses 15, 16 or 17, wherein the step of dynamically nullifying one or more pixels (126) from one or more images (122) of the excluded area (124) comprises modifying the one or more captured images (124b) of the excluded area (124), rendering them indecipherable. Clause 19. A method according to clause 18, wherein the modification of the one or more captured images (124b) of the excluded area (124) is performed in real time during imaging of the designated area to be imaged (118a). Clause 20.
A method for providing a filtered autonomous remote sensing image (51) through a dynamic image masking process (11), the method comprising the following steps: equipping an unmanned aerial vehicle (VASP) (200) with an imaging system (12); designating an imaging area (118) to obtain a designated area to be imaged (118a); establishing a plurality of reference points (120) on a surface (118b) of the designated area to be imaged (118a); designating a plurality of specific surface areas (124a) as an excluded area (124) not to be imaged with reference to the plurality of reference points (120); controlling a pre-established flight plan (17) of the VASP (200) covering the designated area to be imaged (118a); using a navigation system (110) comprising a global positioning system (GPS) (110a), a radio navigation system (110b), an optical navigation system (110c), an inertial measurement unit (IMU) system (110d), an inertial measurement unit (IMU) system equipped with a magnetometer (110e), or a combination thereof, for positioning the imaging system (12) to image the designated area to be imaged (118a); flying the VASP (200) over the designated area to be imaged (118a), and using the imaging system (12) to image the designated area to be imaged (118a) that is covered by the pre-established flight plan (17) of the VASP (200); dynamically nullifying one or more pixels (126) in one or more images (122) of the excluded area (124); and obtaining a filtered autonomous remote sensing image (51) through the dynamic image masking process (11) of the designated area to be imaged (118a). [0029] Clause 21. A method according to clause 20, wherein the step of dynamically nullifying one or more pixels (126) in one or more images (122) of the excluded area (124) comprises guiding the pre-established flight plan (17) of the VASP (200) to avoid overflight of the excluded area (124). [0030] Clause 22.
A method according to clause 20 or 21, wherein the step of dynamically nullifying one or more pixels (126) of one or more images (122) of the excluded area (124) comprises dynamically cancelling, in real time, imaging by the image detection system (22) when the VASP (200) flies over the excluded area (124). Clause 23. A method according to clauses 20, 21 or 22, wherein the step of dynamically nullifying one or more pixels (126) from one or more images (122) of the excluded area (124) comprises modifying the one or more captured images (124b) of the excluded area (124), rendering them indecipherable. Clause 24. A method according to clause 23, wherein modifying the one or more captured images (124b) of the excluded area (124) is performed in real time during imaging of the designated area to be imaged (118a). Clause 25. A method according to clause 23, wherein the modification of the one or more captured images (124b) of the excluded area (124) is performed after overall imaging of the designated area to be imaged (118a) is completed, and before the filtered autonomous remote sensing image (51) is obtained via the dynamic image masking process (11) of the designated area to be imaged (118a). Many modifications and other embodiments of the disclosure will be apparent to those skilled in the art to which the present disclosure relates, having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. The embodiments described herein are intended to be illustrative and are not intended to be limiting or exhaustive. Although specific terms are used herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
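Several of the clauses above describe overwriting excluded pixels with either zero saturation (140) or one hundred percent saturation (142). The short sketch below illustrates that operation in plain Python; the function name `mask_pixels`, the list-of-rows image layout, and the 8-bit value range are illustrative assumptions made for this sketch, not part of the disclosure.

```python
def mask_pixels(image, excluded, saturation="zero"):
    """Return a copy of `image` with the excluded pixels overwritten.

    image:    list of rows of 8-bit pixel values.
    excluded: same-shaped list of booleans; True marks a pixel inside
              the excluded area that must not survive in the product.
    """
    # Zero saturation blanks the pixel; one hundred percent saturation floods it.
    fill = 0 if saturation == "zero" else 255
    return [
        [fill if masked else pixel for pixel, masked in zip(row, mask_row)]
        for row, mask_row in zip(image, excluded)
    ]

frame = [[10, 20], [30, 40]]
mask = [[False, True], [False, False]]
print(mask_pixels(frame, mask))          # -> [[10, 0], [30, 40]]
print(mask_pixels(frame, mask, "full"))  # -> [[10, 255], [30, 40]]
```

Either fill value renders the masked pixels useless for later exploitation while leaving the rest of the frame intact, which is the point of the "guaranteed shutter control" described above.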
Claims:
Claims (14) [0001] 1. A dynamic image masking system (10) for providing a filtered autonomous remote sensing image (51) via a dynamic image masking process (11), the system comprising: a remote sensing platform (14); an imaging system (12) associated with the remote sensing platform (14), the imaging system (12) comprising: an optical system (20); an image detection system (22); a multi-level security system (42) associated with the imaging system (12); one or more image modification locations (90) located in the imaging system (12) and the multi-level security system (42), wherein a modification of the one or more images occurs via the dynamic image masking process (11); and a computer system (130) associated with the imaging system (12), the computer system (130) comprising a controller algorithm (60) configured to send controller commands (62) to one or more controllers (63) that control the one or more image modification locations (90) through the dynamic image masking process (11). [0002] The dynamic image masking system (10) according to claim 1, further comprising a navigation system (110) comprising a global positioning system (GPS) (110a), a radio navigation system (110b), an optical navigation system (110c), an inertial measurement unit (IMU) system (110d), an inertial measurement unit (IMU) system equipped with a magnetometer (110e), or a combination thereof, for positioning the imaging system (12) to image a designated area to be imaged (118a). [0003] The dynamic image masking system (10) according to claim 1 or 2, wherein the remote sensing platform (14) comprises an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d).
[0004] The dynamic image masking system (10) according to claims 1, 2 or 3, wherein the optical system (20) comprises a camera (20a), including a digital camera (20b), and wherein the image detection system (22) comprises a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an X-ray imaging system (22e), or a light-wave detection and telemetry system (LIDAR) (22f). [0005] The dynamic image masking system (10) according to claims 1, 2, 3 or 4, wherein the controller algorithm (60) is further configured to send a controller command (62) to a pre-established collection planning process (16), at an image modification location (90) located at the pre-established collection planning process (16) before input to the imaging system (12), determining an excluded area (124) not to be imaged by the imaging system (12). [0006] A dynamic image masking system (10) according to claims 1, 2, 3, 4 or 5, wherein the controller algorithm (60) is configured to send a controller command (62), either mechanically or optically, to an optical blinding system (64) which controls an image modification location (90) located between the optical system (20) and the image detection system (22), the optical blinding system (64) comprising a mechanical shutter control device (66a) for inhibiting photon collection by one or more pixels (126), or comprising a laser optical device (67a) and micromirrors (67b) for illuminating one or more pixels (126), blinding the one or more pixels (126).
[0007] A dynamic image masking system (10) according to claims 1, 2, 3, 4, 5 or 6, wherein the image detection system (22) comprises a focal plane array subsystem (22a) comprising: a focal plane array (26) which reads raw image data (24) from the optical system (20); an analog-to-digital converter (30) which receives the raw image data (24) from the focal plane array (26) and converts the raw image data (24) from analog signals into digital signals; a volatile temporary memory (34) which receives the digital signals (37) from the analog-to-digital converter (30) and temporarily stores the digital signals (37); a digital signal processor (38) which receives the digital signals (37) from the volatile temporary memory (34) and processes the digital signals (37) into a readable image format (39); and, when the imaging system (12) uses an analog output, a digital-to-analog converter (54) which receives readable digital signals from the digital signal processor (38) and converts the readable digital signals to analog signals. [0008] The dynamic image masking system (10) according to claims 1, 2, 3, 4, 5, 6 or 7, wherein the controller algorithm (60) is configured to send a controller command (62) which controls an image modification location (90) located in the multi-level security system (42) between a resulting non-volatile memory (44) and a post-processing process (48), overwriting one or more pixels (126) with either zero saturation (140) or one hundred percent saturation (142). [0009] A dynamic image masking system (10) according to claims 1, 2, 3, 4, 5, 6, 7 or 8, wherein the controller algorithm (60) is configured to send a controller command (62) which controls an image modification location (90) at a post-processing process (48) of the multi-level security system (42), editing or ignoring one or more pixels (126) representing an excluded area (124) of a designated area to be imaged (118a).
[0010] A method of providing a filtered autonomous remote sensing image (51) via a dynamic image masking process (11), the method comprising the steps of: equipping a remote sensing platform (14) with an imaging system (12); designating an imaging area (118) to obtain a designated area to be imaged (118a); establishing a plurality of reference points (120) on a surface (118b) of the designated area to be imaged (118a); designating a plurality of specific surface areas (124a) as an excluded area (124) not to be imaged with reference to the plurality of reference points (120); monitoring a pre-established collection planning process (16) covering the designated area to be imaged (118a); using a navigation system (110) comprising a global positioning system (GPS) (110a), a radio navigation system (110b), an optical navigation system (110c), an inertial measurement unit (IMU) system (110d), an inertial measurement unit (IMU) system equipped with a magnetometer (110e), or a combination thereof, for positioning the imaging system (12) to image the designated area to be imaged (118a); using the imaging system (12) to image the designated area to be imaged (118a) that is covered through the pre-established collection planning process (16); dynamically nullifying one or more pixels (126) in one or more images (122) of the excluded area (124); and obtaining a filtered autonomous remote sensing image (51) through the dynamic image masking process (11) of the designated area to be imaged (118a).
[0011] The method of claim 10, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping the remote sensing platform (14) with the imaging system (12) comprising an optical system (20) including a digital camera (20a), and an image detection system (22) including a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an X-ray imaging system (22e), or a light-wave detection and telemetry system (LIDAR) (22f). [0012] The method of claims 10 or 11, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping the remote sensing platform (14) comprising an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d). [0013] The method of claims 10, 11 or 12, wherein the step of dynamically nullifying one or more pixels (126) from one or more images (122) of the excluded area (124) comprises modifying one or more captured images (124b) of the excluded area (124), rendering them indecipherable. [0014] The method of claim 13, wherein modifying the one or more captured images (124b) of the excluded area (124) is performed in real time during imaging of the designated area to be imaged (118a).
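Claim 10 establishes reference points (120) on the surface and designates specific surface areas as an excluded area (124) with reference to them, which implies a point-in-region membership test at collection time. A minimal sketch of such a test, using the standard ray-casting algorithm over (x, y) reference points, is given below; the function name and the coordinate convention are assumptions for illustration only, not the disclosed implementation.

```python
def in_excluded_area(point, polygon):
    """Ray-casting test: is `point` (x, y) inside the polygon given as a
    list of (x, y) reference points? Here it stands in for deciding
    whether a camera footprint centre falls in an area not to be imaged."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from `point` cross edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
print(in_excluded_area((2.0, 2.0), square))  # -> True
print(in_excluded_area((5.0, 2.0), square))  # -> False
```

A test of this kind could drive any of the modification locations described above: suppressing collection, overwriting pixels, or guiding the flight plan away from the excluded area.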
Similar technologies:
Publication number | Publication date | Patent title FR3026540A1|2016-04-01|SYSTEM AND METHOD FOR MASKING DYNAMIC IMAGE Cruzan et al.2016|Small unmanned aerial vehicles | in plant ecology O'Connor et al.2017|Cameras and settings for aerial surveys in the geosciences: Optimising image data Lucieer et al.2014|HyperUAS - Imaging spectroscopy from a multirotor unmanned aircraft system WO2016203151A1|2016-12-22|System and method for automatically inspecting surfaces US20150254738A1|2015-09-10|Systems and methods for aerial imaging and analysis Fazeli et al.2016|Evaluating the potential of RTK-UAV for automatic point cloud generation in 3D rapid mapping Cimoli et al.2017|Application of low-cost UASs and digital photogrammetry for high-resolution snow depth mapping in the Arctic Roth et al.2018|PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned areal systems FR3003380A1|2014-09-19|METHOD FOR MONITORING THE VEGETATION STATE AT THE ABORDIES OF AN INFRASTRUCTURE FR3003377A1|2014-09-19|PROCESS FOR ANALYZING A CULTIVATED AGRICULTURAL PLANT FR3006296A1|2014-12-05|DRONE COMPRISING A MULTISPECTRAL IMAGE DEVICE FOR THE GENERATION OF MAPS REPRESENTING A PLANT STATE OF A CULTURE Yang et al.2015|Low-cost single-camera imaging system for aerial applicators Wierzbicki et al.2018|Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle EP1936330A1|2008-06-25|Method and system for processing and viewing images of the surroundings of an aircraft EP1570433B1|2006-05-17|Image processing device with recognition and selection of light sources US10684166B2|2020-06-16|Information processing device, information processing method, and program Szabó et al.2018|Zooming on aerial survey FR3003356A1|2014-09-19|METHOD FOR OBSERVING A ZONE USING A DRONE Schedl et al.2020|Airborne optical sectioning for nesting observation Howell et al.2018|Structure from motion techniques for estimating the volume of wood chips
US10891483B2|2021-01-12|Texture classification of digital images in aerial inspection JPWO2019026619A1|2020-08-20|Image processing apparatus, image processing method, and program JPWO2019208537A1|2021-06-10|Information processing device JPWO2016189853A1|2018-03-22|Image processing apparatus, image processing system, image processing method, and computer program
Patent family:
Publication number | Publication date US20170018058A1|2017-01-19| CN105526916B|2020-10-02| US9846921B2|2017-12-19| FR3026540B1|2019-03-29| CN105526916A|2016-04-27| JP6629019B2|2020-01-15| JP2017027571A|2017-02-02|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title JP2001023072A|1999-07-05|2001-01-26|Nippon Signal Co Ltd:The|Road traffic information providing system| US7019777B2|2000-04-21|2006-03-28|Flight Landata, Inc.|Multispectral imaging system with spatial resolution enhancement| JP2003173449A|2001-12-06|2003-06-20|Dowa Koei Kk|Remote sensing supporting device, its program and program recording medium| US6897446B2|2002-03-26|2005-05-24|Lockheed Martin Corporation|Method and system for target detection using an infra-red sensor| JP2004030460A|2002-06-27|2004-01-29|Starlabo Corp|Image processing method, image processing program and recording medium with the same program recorded thereon| JP4508753B2|2003-07-12|2010-07-21|LG Electronics Inc.|Camera photographing restriction system and method for portable terminal| KR100652619B1|2003-07-18|2006-12-01|LG Electronics Inc.|Usage restriction system and method for digital camera adapted to mobile terminal| US8918540B2|2005-09-26|2014-12-23|The Boeing Company|Unmanned air vehicle interoperability agent| US20070070894A1|2005-09-26|2007-03-29|Fan Wang|Method to determine a scheduling priority value for a user data connection based on a quality of service requirement| JP4356733B2|2006-11-09|2009-11-04|Aisin Seiki Co., Ltd.|In-vehicle image processing apparatus and control method thereof| US8218868B2|2007-06-13|2012-07-10|Sensors Unlimited, Inc.|Method and apparatus for enhancing images| US9041915B2|2008-05-09|2015-05-26|Ball Aerospace & Technologies Corp.|Systems and methods of scene and action capture using imaging system incorporating 3D LIDAR| US8731234B1|2008-10-31|2014-05-20|Eagle View Technologies, Inc.|Automated roof identification systems and methods| US20140347482A1|2009-02-20|2014-11-27|Appareo Systems, Llc|Optical image monitoring system and method for unmanned aerial vehicles| US8713215B2|2009-05-29|2014-04-29|Z Microsystems, Inc.|Systems and methods for image stream processing| US8266333B1|2009-05-29|2012-09-11|Z Microsystems, Inc.|System and method for
parallel image processing and routing| US9163909B2|2009-12-11|2015-10-20|The Boeing Company|Unmanned multi-purpose ground vehicle with different levels of control| WO2012044297A1|2010-09-30|2012-04-05|Empire Technology Development Llc|Automatic flight control for uav based solid modeling| US9086484B2|2011-06-30|2015-07-21|The Boeing Company|Context-based target recognition| WO2014031557A1|2012-08-20|2014-02-27|Drexel University|Dynamically focusable multispectral light field imaging| JP6055274B2|2012-10-31|2016-12-27|Topcon Corporation|Aerial photograph measuring method and aerial photograph measuring system| DE102013019488A1|2012-11-19|2014-10-09|Mace Wolf|PHOTO WITH PROTECTION OF THE PRIVACY| US20140312165A1|2013-03-15|2014-10-23|Armen Mkrtchyan|Methods, apparatus and systems for aerial assessment of ground surfaces| EP3069509A4|2013-11-14|2017-09-20|KSI Data Sciences, Inc.|A system and method for managing and analyzing multimedia information|AU2015374040A1|2014-12-31|2017-07-13|AirMap, Inc.|System and method for controlling autonomous flying vehicle flight paths| US10029804B1|2015-05-14|2018-07-24|Near Earth Autonomy, Inc.|On-board, computerized landing zone evaluation system for aircraft| TW201836020A|2017-02-17|2018-10-01|Semiconductor Energy Laboratory Co., Ltd.|Semiconductor device and method for manufacturing semiconductor device| CN108513567A|2017-03-23|2018-09-07|SZ DJI Technology Co., Ltd.|The method and unmanned vehicle of image co-registration| US10606271B2|2017-07-17|2020-03-31|The Boeing Company|Magnetic navigation and positioning system| US10922431B2|2017-12-27|2021-02-16|Honeywell International Inc.|Systems and methods for dynamically masking video and images captured by a drone device camera|
Legal status:
2016-09-26| PLFP| Fee payment|Year of fee payment: 2 | 2017-09-25| PLFP| Fee payment|Year of fee payment: 3 | 2018-08-03| PLSC| Publication of the preliminary search report|Effective date: 20180803 | 2018-09-25| PLFP| Fee payment|Year of fee payment: 4 | 2019-09-25| PLFP| Fee payment|Year of fee payment: 5 | 2020-09-25| PLFP| Fee payment|Year of fee payment: 6 | 2021-09-27| PLFP| Fee payment|Year of fee payment: 7 |
Priority:
申请号 | 申请日 | 专利标题 US14500589|2014-09-29| US14/500,589|US9846921B2|2014-09-29|2014-09-29|Dynamic image masking system and method| 相关专利