DRONE EQUIPPED WITH A VIDEO CAMERA AND MEANS FOR COMPENSATING THE ARTIFACTS PRODUCED AT THE MOST IMPORTANT ROLL ANGLES
Patent abstract:
The drone comprises a camera, an inertial unit measuring the attitude angles of the drone, and an extractor module delivering the data of a reduced image zone (ZI) defined within a capture zone (ZC) of the sensor. A servo module dynamically modifies the position and the orientation of the image zone within the capture zone, in a direction opposite to that of the angle variations measured by the inertial unit. The sensor can operate in a plurality of different, dynamically selectable configurations, with a basic configuration using a basic capture zone (ZCB) for low roll angle values (θ), and at least one degraded-mode configuration using an extended capture zone (ZCE), larger than the basic capture zone (ZCB), for high roll angle values (θ).

Publication number: FR3032052A1
Application number: FR1550568
Filing date: 2015-01-26
Publication date: 2016-07-29
Inventor: Pierre Eline
Applicant: Parrot SA
Patent description:
[0001] The invention relates to the processing of digital images captured by a camera on board a mobile device, in particular a motorized flying machine such as a drone.

The invention applies advantageously to the images collected by the front camera of a rotary-wing drone such as a quadricopter. The AR.Drone 2.0 or the Bebop Drone of Parrot SA, Paris, France, are typical examples of such quadricopters. They are equipped with a series of sensors (accelerometers, 3-axis gyrometers, altimeters), a front camera capturing an image of the scene towards which the drone is directed, and a vertical-view camera capturing an image of the terrain overflown. They are provided with multiple rotors driven by respective motors that can be controlled in a differentiated manner so as to pilot the drone in attitude and speed. Various aspects of these drones are described in particular in WO 2010/061099 A2, EP 2 364 757 A1, EP 2 613 213 A1 or EP 2 613 214 A1 (Parrot SA).

The front video camera can be used for piloting in "immersive mode", that is to say where the user uses the image of the camera in the same way as if he were himself on board the drone. It can also be used to capture sequences of images of a scene towards which the drone moves, so that the user can use the drone in the same way as a camera or a camcorder that, instead of being held in the hand, would be carried by the drone. The images collected can be recorded and broadcast, posted on video-hosting web sites, sent to other Internet users, shared on social networks, etc.

Since these images are intended to be recorded and communicated, it is desirable that they exhibit the fewest possible defects, in particular defects caused by the behaviour of the drone, which can induce untimely oscillations, deformations and other artifacts in the image. The appearance of these artifacts is, however, inherent in the way the drone evolves, since any linear displacement of the drone forwards, backwards or to the side involves a tilting of the drone, and therefore a corresponding, undesirable effect of shift, rotation or oscillation of the image collected by the camera.

[0002] These defects are tolerable in an "immersive piloting" configuration, but if the drone is to be used in the manner of a mobile video camera to capture sequences that will be recorded and played back later, these defects are extremely troublesome, so that it is desirable to reduce them to a minimum.

In the case of the above-mentioned Bebop Drone, the camera is equipped with a hemispherical-field fisheye lens covering a field of about 180°, but only a portion of the captured field is used, this portion corresponding approximately to the angular sector captured by a conventional camera. For this purpose, a particular window (hereinafter the "image zone") is selected in the raw image delivered at the output of the sensor (hereinafter the "capture zone"), which is itself a portion of the overall hemispherical image formed on the surface of the sensor. This window is mobile in rotation and in translation, permanently displaced as a function of the movements of the drone determined by the inertial unit, and in the direction opposite to the detected movement. The image collected by the fisheye lens certainly undergoes the same movements of oscillation and rotation as that of a conventional camera, but the displacement of the image zone is servo-controlled so as to compensate these movements and thereby produce an image that is stabilized with respect to the movements of the drone.
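The servo principle just described can be illustrated by a minimal sketch: the image zone is simply re-centred and counter-rotated inside the capture zone according to the attitude measured by the inertial unit. The zone size, the linear pixels-per-radian mapping and the sign conventions below are illustrative assumptions and are not taken from the patent.

```python
import math

# Illustrative parameters (not taken from the patent): size of the capture zone ZC
# and a simple linear mapping from tilt angle to pixel shift on the sensor.
CAPTURE_W, CAPTURE_H = 2048, 2048
PIX_PER_RAD = 900.0

def place_image_zone(pitch_rad: float, roll_rad: float):
    """Centre (cx, cy) and rotation of the image zone ZI inside the capture zone ZC,
    chosen opposite to the measured attitude so the extracted view stays level and
    aimed at the horizon (image y axis pointing downwards, positive pitch = nose down)."""
    cx = CAPTURE_W / 2.0                            # lateral placement ignored in this sketch
    cy = CAPTURE_H / 2.0 - pitch_rad * PIX_PER_RAD  # nose-down tilt -> window moved up
    rotation = -roll_rad                            # counter-rotation cancels the roll
    return cx, cy, rotation

cx, cy, rot = place_image_zone(math.radians(10.0), math.radians(12.0))
print(f"image zone centre = ({cx:.0f}, {cy:.0f}) px, rotation = {math.degrees(rot):.1f} deg")
```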
A "virtual camera" is thus defined by extracting from the captured scene (the capture zone) a particular zone (the image zone) which is dynamically displaced, in rotation and in translation, in the capture zone 25 in the opposite direction of the movements drone to annihilate the oscillations that otherwise would be observed on the image. This technique is described in application FR 14 56302 of July 2, 2014 for "rotary-wing drone equipped with a video camera delivering stabilized image sequences", in the name of the Applicant. [0003] The present invention aims at the elimination of a particular defect which appears during certain evolutions of the drone, when this one is strongly inclined on the side. This situation occurs in particular during the abrupt translations of the drone to the right or to the left, or during very tight turns: these movements are created by, in particular, pivoting around the axis 3032052 3 roll of the drone, pivoting which are reflected in the image by rotations in one direction or the other of the scene captured by the camera These rotations can be offset by a rotation in the opposite direction of the image area in the capture area. However, it appears that, when the inclination angles of the drone are large, there is no position of the image area that allows to completely generate a rectified view. Indeed, if the rotation of the image zone is too great, opposite corners or edges of this image zone will "overflow" from the capture zone defined in the hemispherical image, which will result in the fugitive appearance of gray corners or edges on the rectified image delivered at the output of the correction module. These gray areas will certainly disappear when the drone regains a less inclined attitude, but an annoying artifact has been introduced and will remain visible on a sequence of images to be recorded and reproduced later. [0004] This situation also occurs when the user holds the drone by hand with the engines stopped, and uses it in the manner of a conventional camera to film a scene while benefiting from the dynamic stabilization of the system: tilts the drone beyond a certain threshold, areas with missing pixels appear at the borders of the image, even though the stabilization system gives the illusion that the camera has not been tilted. The starting point of the invention consists essentially in providing several operating configurations of the sensor, and in dynamically selecting the most appropriate configuration according to the angle of inclination of the drone, so as to avoid, during the tilt correction processing, any overflow of the image area beyond the region of the sensor initially selected to produce the image. By "operating configuration" is meant a set of camera capture parameters, including: - the output resolution, i.e. the size in pixels of the image outputted by the camera (image area); the capture resolution, that is to say the size expressed in pixels of the region acquired on the surface of the sensor (capture zone); and the capture mode, i.e., the processing for changing from the capture resolution to the output resolution. [0005] 3032052 4 The capture mode notably includes so-called "normal" modes (where all the acquired pixels are transmitted as they are output, the output resolution and the capture resolution being identical), "binning" (where the acquired pixels are grouped together). 
macropixels delivered at the output), and the "scaling" mode (where the pixels are merged with application of a filter taking into account, for each pixel, its neighbours in the image). In particular, the binning and scaling techniques make it possible, compared with the normal mode, to produce an image from a larger captured area, but at the expense of a loss of definition.

[0006] Usually, these capture parameters are set during the initialization of the camera. Specifically, with a configuration corresponding to that of the aforementioned Bebop Drone, it is possible in normal mode to compensate roll angles of ±15° without introducing areas devoid of pixels.

[0007] The problem addressed by the invention is to increase the stabilization range beyond this limit, without loss of pixels (that is to say without the appearance of gray edges or corners) and without degrading the fluidity of the video stream (i.e. with a constant framerate).

To do this, the invention essentially proposes to use several operating configurations of the sensor and to select dynamically, during shooting and as a function of the roll angle of the drone, the configuration best suited to generating a complete rectified view with optimal image quality. The principle consists in playing, for example, on the different modes of the sensor (normal, binning, scaling) so as to temporarily enlarge the capture surface (for example by switching from the normal mode to a binning or scaling mode) when the drone is strongly inclined. This modification is made at the cost of a transient degradation of quality (because of the use of the binning or scaling mode), but without introducing artifacts (gray edges or corners) and without modifying the framerate, hence without degrading the fluidity of the video stream.

More specifically, the invention proposes a motorized flying machine as described above, that is to say comprising, in a manner known per se:
- a camera linked to the body of the drone, with a lens and a digital sensor collecting the image formed by the lens;
- an inertial unit able to measure the Euler angles describing the instantaneous attitude of the drone with respect to an absolute terrestrial reference frame;
- an extractor module, receiving as input pixel data of a capture zone on the sensor and delivering as output pixel data of a reduced image zone defined within the capture zone; and
- a servo module, receiving as input values of the angles delivered by the inertial unit, and able to dynamically modify the position and the orientation of the image zone inside the capture zone, in a direction opposite to that of the changes of the values of the angles measured by the inertial unit.
Characteristically, according to the invention:
- the sensor is a sensor able to operate according to a plurality of different operating configurations, dynamically selectable during one and the same shooting sequence of the camera, said operating configurations comprising:
  - a basic configuration using a basic capture zone as the capture zone, and
  - at least one degraded-mode configuration using as the capture zone an extended capture zone larger than the basic capture zone, with application to the pixels of the extended capture zone of a conversion processing able to reduce the dimensions of the extended capture zone to those of the basic capture zone; and
- there is further provided a compensator module, receiving as input successive values of the roll angle of the drone delivered by the inertial unit, and able to dynamically modify the current operating configuration of the sensor in such a way that the sensor operates:
  - in the basic configuration for roll angle values lower than a first limit, and
  - in a degraded-mode configuration for roll angle values greater than a second limit.

Preferably, the roll-effect compensation means are able to dynamically change the current operating configuration of the sensor from the basic configuration to the degraded-mode configuration when the value of the roll angle increases beyond a first predetermined threshold, and from the degraded-mode configuration to the basic configuration when the roll angle of the drone decreases below a second predetermined threshold, the second threshold being different from the first threshold and lower than it.

The operating configurations may include in particular:
- the basic configuration, using the basic capture zone as the capture zone;
- a moderately degraded-mode configuration using as the capture zone a first extended capture zone larger than the basic capture zone, with application to the pixels of the first extended capture zone of a first conversion processing able to reduce the dimensions of the first extended capture zone to those of the basic capture zone; and
- a strongly degraded-mode configuration using as the capture zone a second extended capture zone larger than the first extended capture zone, with application to the pixels of the second extended capture zone of a second conversion processing, different from the first, able to reduce the dimensions of the second extended capture zone to those of the basic capture zone.

The compensator module then dynamically modifies the current operating configuration of the sensor such that the sensor operates:
- according to the basic configuration for roll angle values lower than the first limit;
- according to the moderately degraded-mode configuration for roll angle values greater than the second limit and lower than a third limit; and
- according to the strongly degraded-mode configuration for roll angle values greater than the third limit.

An embodiment of the invention will now be described with reference to the accompanying drawings, in which the same references designate elements that are identical or functionally similar from one figure to another.

[0008] Figure 1 is an overview showing the drone and the associated remote-control apparatus allowing it to be piloted remotely.

Figures 2a and 2b illustrate the changes in the camera sighting direction caused by a forward tilting of the drone, for example during an acceleration phase.
Figure 3 is an example of an image formed on the sensor of the drone's camera, illustrating the successive steps (a) to (d) of windowing and correction of the distortions applied to this image so as to produce a rectified image.

[0009] Figures 4a and 4b illustrate the phenomenon occurring when the roll angle of the drone is large and there is no position of the capture window making it possible to generate a complete rectified view.

Figure 5 illustrates, in block-diagram form, the various elements making it possible to carry out, according to the invention, the dynamic selection of the operating configurations of the sensor.

Figure 6 is a schematic representation illustrating the selection between two different possible configurations as a function of the roll angle of the drone.

Figure 7 is a schematic representation illustrating the selection between three different possible configurations as a function of the roll angle of the drone.

Figures 8a to 8e are views explaining the successive steps of modification of the operating configuration of the sensor, for an example of a rolling oscillation movement of the drone starting from the horizontal and returning to the original attitude.

[0010] An example of implementation of the invention will now be described.

[0011] In Figure 1, reference 10 generally denotes a drone, which is for example a quadricopter such as the Bebop Drone model of Parrot SA, Paris, France. This drone comprises four coplanar rotors 12 whose motors are controlled independently by an integrated navigation and attitude-control system. It is provided with a front camera 14 making it possible to obtain an image of the scene towards which the drone is oriented.

The drone also comprises a vertical-view camera (not shown) pointing downwards, able to capture successive images of the terrain overflown and used in particular to evaluate the speed of the drone relative to the ground. Inertial sensors (accelerometers and gyrometers) make it possible to measure, with a certain accuracy, the angular velocities and the attitude angles of the drone, that is to say the Euler angles (pitch φ, roll θ and yaw ψ) describing the inclination of the drone with respect to the horizontal plane of a fixed terrestrial reference frame. An ultrasonic range finder disposed under the drone also provides a measurement of the altitude relative to the ground.

The drone 10 is piloted by a remote-control apparatus 16 provided with a touch screen 18 displaying the image captured by the front camera 14, with, in superposition, a certain number of symbols allowing the activation of piloting commands by simple contact of a user's finger 20 on the touch screen. The apparatus 16 is provided with means for radio link with the drone, for example of the Wi-Fi (IEEE 802.11) local-network type, for the bidirectional exchange of data: from the drone 10 to the apparatus 16, in particular for the transmission of the image captured by the camera 14, and from the apparatus 16 to the drone 10 for the sending of piloting commands.

The remote-control apparatus 16 is also provided with inclination sensors making it possible to control the attitude of the drone by imparting corresponding inclinations to the apparatus along the roll and pitch axes, it being understood that the two longitudinal and transverse components of the horizontal velocity of the drone 10 will be intimately related to the inclination along the two respective pitch and roll axes.
The piloting of the drone consists in making it evolve by:
a) rotation about a pitch axis 22, to make it move forwards or backwards;
b) rotation about a roll axis 24, to shift it to the right or to the left;
c) rotation about a yaw axis 26, to make the main axis of the drone pivot to the right or to the left; and
d) translation downwards or upwards by changing the throttle setting, so as respectively to reduce or increase the altitude of the drone.

When these piloting commands are applied by the user from the remote-control apparatus 16, the commands a) and b) for pivoting about the pitch axis 22 and the roll axis 24 are obtained by inclinations of the apparatus 16 about, respectively, its longitudinal axis 28 and its transverse axis 30: for example, to make the drone move forwards it suffices to tilt the remote-control apparatus 16 forwards about the axis 28; to shift it to the right, it suffices to tilt the apparatus to the right about the axis 30; and so on. The commands c) and d), for their part, result from actions applied by contact of the user's finger 20 with corresponding specific areas of the touch screen 18.

[0012] The drone also has an automatic and autonomous hovering-flight stabilization system, activated in particular when the user removes his finger from the touch screen of the apparatus, or automatically at the end of the take-off phase, or again in case of interruption of the radio link between the apparatus and the drone.

[0013] Figure 2a schematically illustrates, in profile, the attitude of the drone when it is stationary, in a hovering state. The field covered by a front camera 14 of conventional type, for example a camera covering a field of 54° and whose sighting axis δ is centred on the horizon, is schematized at 36.

[0014] If, as shown in Figure 2b, the drone moves forwards with a non-zero horizontal speed, then by design the axis 26 of the drone is tilted forwards by an angle φ (pitch angle) with respect to the vertical V. This forward inclination, schematized by the arrow 38, implies an inclination of the same value, schematized by the arrow 40, of the axis δ of the camera with respect to the plane of the horizon HZ. It will thus be understood that, as the drone evolves, accelerates, slows down, etc., the axis δ oscillates permanently about the direction of the horizon HZ, which is reflected in the image by permanent up-and-down oscillation movements.

[0015] Similarly, if the drone shifts to the right or to the left, this movement is accompanied by a pivoting about the roll axis 24, which is reflected in the image by rotations, in one direction or the other, of the scene captured by the camera.

[0016] In order to overcome this drawback it has been proposed, as explained in the above-mentioned application FR 14 56302, to equip the camera with a fisheye-type hemispherical-field lens covering a field of about 180°, schematized at 42 in Figure 2a. The image captured by the camera equipped with this fisheye lens certainly undergoes the same movements of oscillation and rotation as that of a conventional camera, but only a part of the field captured by this camera is used, by selecting a particular window, hereinafter the "image zone", corresponding to the angular sector 36 captured by a conventional camera, which is dynamically displaced within the hemispherical image in the direction opposite to the movements of the drone as determined by the inertial unit, so as to annihilate the oscillations that would otherwise be observed in the image.
Thus, in the case illustrated in Figure 2b, where the drone plunges downwards by a pitch angle φ (arrow 38) with respect to the vertical V, the capture window is moved upwards (arrow 44) by an angle of the same value, thereby bringing the central axis of the sector 36 of a "virtual camera" covering the field corresponding to the image zone back towards the direction of the horizon HZ.

As illustrated in the figures, insofar as the forward movements of the drone are more frequent than the backward ones and insofar as the areas of interest (the terrain overflown) are located below the level of the drone rather than above it, it may be advantageous to tilt the main axis Δ of the fisheye lens downwards (for example by a sight angle of -20°), so as to cover a greater number of configurations of evolution of the drone and to ensure that the sector 36 corresponding to the capture zone of the "virtual camera" always remains within the field 42 of the fisheye lens.

Figure 3 shows at (a) an example of a scene as picked up on the sensor of the video camera equipped with the fisheye lens. As can be seen, the image I of this scene exhibits very strong geometric distortions, inherent in the hemispherical or quasi-hemispherical coverage of the fisheye lens projected onto the flat surface of the sensor.

Only part of the image I produced by the fisheye lens is used. This part is determined as a function of i) the direction in which the "virtual camera" is pointed, ii) its field of view (schematized at 36 in Figures 2a and 2b) and iii) its width/height ratio. A "capture zone" ZC containing raw pixel data is thus defined, which includes the "useful zone" ZU corresponding to the field of the virtual camera after compensation of the geometric distortions introduced by the fisheye lens.

It should be noted that it is not useful to capture all the pixels of the image I formed on the sensor, but only a fraction of them, corresponding to the capture zone ZC. Thus, if it is desired to obtain, for example, an image of HD quality (1920 x 1080 pixels, i.e. 2 Mpixel for the useful zone ZU), it is necessary to have at the outset a fisheye image of very high resolution in order to be able to extract a good-quality HD view whatever the direction in which the virtual camera is pointed, which implies a sensor whose resolution will typically be 14 Mpixel (4608 x 3288 pixels). Under these conditions, if the entire image I were transferred for processing, this would correspond to a pixel data stream of 14 Mpixel for each image, leading to a framerate of the order of 6 frames per second (fps) at this resolution, which would be insufficient for a fluid video sequence (which requires a framerate close to 30 fps). Therefore, only the pixel data of the capture zone ZC that are actually needed are transferred, for example a capture window ZC of about 2 Mpixel, which can then be refreshed at 30 fps without any particular difficulty. It is thus possible to choose a high-resolution sensor while keeping a high image rate.
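The orders of magnitude quoted above can be checked with a short calculation, assuming a fixed pixel throughput for the sensor read-out; the 90 Mpixel/s figure is an assumption chosen only so that the two cases land near the ~6 fps and 30 fps values mentioned in the text.

```python
# Assumed read-out throughput for the sensor link; the 90 Mpixel/s value is an
# assumption chosen only to reproduce the orders of magnitude quoted in the text.
THROUGHPUT_MPIX_PER_S = 90.0

def max_framerate(width_px: int, height_px: int) -> float:
    """Frames per second achievable if every pixel of the window must be read out."""
    return THROUGHPUT_MPIX_PER_S / (width_px * height_px / 1e6)

print(f"full 14 Mpixel sensor (4608 x 3288) : {max_framerate(4608, 3288):.1f} fps")   # ~6 fps
print(f"2 Mpixel capture zone (1920 x 1080) : {max_framerate(1920, 1080):.1f} fps")   # >30 fps
```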
[0017] The views (a) to (d) of Figure 3 illustrate the various processing operations performed on the pixel data of the capture zone ZC in order to arrive at a final image compensated for the geometric distortions. From the pixel data transferred from the capture zone ZC (view (b)), the process extracts the pixel data of the raw useful zone ZUB (view (c)) and applies to them a mesh of triangles (a technique known per se), which straightens the image by stretching each triangle, so as to give a rectified useful zone ZUR (view (d)) of rectified pixel data. In particular, the strongly curved horizontal lines of the fisheye image are corrected so as to make them straight and produce an image corresponding to natural vision, free of geometric distortions.

The present invention aims to remedy a particular drawback which appears during the extraction of the image zone ZI from the capture zone ZC when the roll angle of the drone (rotation about the longitudinal axis 24) is large, typically when it exceeds values of the order of 15°. This can occur in particular in the case of an abrupt lateral translation of the drone to the right or to the left, or of a very tight turn in one direction or the other (the inclination at a roll angle then combining with an inclination at a pitch angle, the latter being all the greater as the horizontal speed of the drone is high).

This phenomenon is illustrated in Figures 4a and 4b: when the attitude of the drone is horizontal (an attitude corresponding to hovering flight), the image collected on the surface of the sensor is that shown in Figure 4a, with a rectangular capture zone ZC centred on the hemispherical image produced by the fisheye lens, and a smaller rectangular image zone ZI located inside the capture zone ZC and centred in the same way. In the case of a change of attitude of the drone, with rotation of the drone to the left or to the right about its roll axis, for example a rotation to the right by an angle θ as shown in Figure 4b, the image formed on the sensor undergoes a corresponding rotation. But for roll angle values exceeding a certain threshold, the image zone ZI can no longer be completely inscribed within the capture zone ZC, and two opposite corners ZX of this image zone ZI overflow the capture zone ZC. In these zones ZX no pixel data will be delivered (since the pixels of the sensor that are extracted and delivered at the output of the camera are restricted to the capture zone ZC), which translates, in the final image, into two gray zones in two corresponding opposite corners. Even though these gray zones appear only very fleetingly during the evolution of the drone, they nevertheless constitute an annoying artifact, which remains visible in a sequence of images intended to be recorded and played back later.

The same phenomenon appears if the drone is simply used at a standstill and held in the hand, in the manner of a simple camera: if the drone (hence the camera) is tilted too far about its longitudinal axis, the artifact of the gray zones ZX is present for as long as this excessive inclination lasts.

To overcome this drawback, the invention proposes to switch dynamically between different operating configurations of the sensor. By "operating configuration" is meant, as indicated in the introduction, a set of capture parameters of the camera including:
- the output resolution, that is to say the size expressed in pixels of the image delivered at the output of the camera (image zone ZI);
- the capture resolution, that is to say the size expressed in pixels of the region acquired on the surface of the sensor (capture zone ZC); and
- the capture mode, that is to say the processing making it possible to go from the capture resolution to the output resolution.
For this last parameter, digital cameras generally offer several video data acquisition modes, the most common of which are:
- the "normal" mode, in which all the pixels acquired within a given window are transmitted as they are at the output of the sensor. The capture resolution is therefore identical to the output resolution, and the image quality is maximum;
- the so-called "binning" mode, in which neighbouring pixels are electrically grouped to form macropixels, and it is these macropixels that are delivered at the output as pixel data. It thus becomes possible, in a comparable acquisition time (hence without impact on the fluidity of the video sequence), to capture a larger area, but at the cost of a significant loss of definition, of at least 50%: the pixels are grouped at least two by two, horizontally and/or vertically, to form a macropixel by averaging the individual pixels;
- the so-called "scaling" mode, which makes it possible to adjust the reduction level more finely than the binning mode. Indeed, whereas binning is limited to factor-of-2 ratios since it operates by grouping pixels, scaling performs a software reduction with filtering of each individual pixel as a function of the value of the neighbouring pixels, allowing reduction ratios of 3/4, 2/3, etc., with a higher-quality image than in binning mode; but this mode is slower, because additional processing is necessary to reduce the image, and all the pixels of the captured surface must be acquired in order to perform the filtering.

Figure 5 illustrates, in block-diagram form, the various elements of an image-processing circuit for controlling, according to the invention, the dynamic switching of the operating configurations.

[0018] The camera 14 of the drone is subject to angular displacements. These displacements are measured by an inertial unit (IMU) 50 delivering signals representative of the instantaneous rotations of the drone and hence of the camera, the inertial unit 50 and the camera 14 being mechanically integral with each other. These rotations are given by the pitch angle φ, the roll angle θ and the yaw angle ψ, which describe the inclination of the drone in the three dimensions with respect to a fixed terrestrial reference frame (Euler angles).

These rotation measurements are applied to a video processing module 52 performing windowing, image stabilization and artifact correction on the raw image signal delivered by the camera 14. The module 52 delivers at its output a corrected and stabilized image signal, which can then be transmitted to the user for viewing on a remote screen, recording in a digital memory, etc.

The windowing carried out by the module 52 is controlled by a module 54 which computes the position of the capture window ZC in the hemispherical image formed on the digital sensor, on the basis of predictions of the rotation angles φ, θ and ψ delivered to it by a predictor module 56 receiving as input the instantaneous values of the rotations of the drone delivered by the inertial unit 50. These data are delivered at the acquisition frequency of the gyrometers, typically 990 Hz, much higher than the frequency of acquisition of the images by the video camera, typically 30 Hz. Characteristically, according to the invention, the module 54 also includes a module 58 for selecting the best configuration of the sensor, as a function of the roll angle predicted by the module 56.
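As an illustration of the two degraded acquisition modes described above, the following sketch reproduces a horizontal 2 x 1 binning and a filtered scaling reduction in software; a real sensor performs binning electrically on-chip, so this is only a software analogue, and the window size used here (2816 x 2112 pixels) is only an example.

```python
import numpy as np

def bin_horizontal_2x(raw: np.ndarray) -> np.ndarray:
    """Group neighbouring pixels 2 x 1 horizontally into macropixels by averaging
    (arrays are indexed (height, width); the width must be even)."""
    h, w = raw.shape
    return raw.reshape(h, w // 2, 2).mean(axis=2)

def scale_horizontal(raw: np.ndarray, factor: float) -> np.ndarray:
    """Software reduction along the width: each pixel is filtered with its horizontal
    neighbours, then the filtered row is subsampled; ratios such as 2/3 or 3/4 are possible."""
    h, w = raw.shape
    xs = (np.arange(int(w * factor)) / factor).astype(int)   # source column of each output pixel
    padded = np.pad(raw, ((0, 0), (1, 1)), mode="edge")
    filtered = (padded[:, 0:w] + padded[:, 1:w + 1] + padded[:, 2:w + 2]) / 3.0
    return filtered[:, xs]

raw = np.random.rand(2112, 2816)            # illustrative extended capture window
print(bin_horizontal_2x(raw).shape)         # (2112, 1408): 1408 x 2112 output resolution
print(scale_horizontal(raw, 0.5).shape)     # (2112, 1408), with finer filtering than binning
```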
Figure 6 schematically illustrates the manner in which this selection is made, as a function of the roll angle of the drone, between two different possible configurations, for example:
- a "normal mode" configuration when the roll angle θ is smaller than 15°; the configuration selected ensures maximum quality, for example with an output resolution and a capture resolution of 1408 x 2112 pixels; and
- a degraded-mode configuration, for example a "horizontal binning" configuration, when the roll angle θ exceeds the threshold (for example a threshold of θ = 15°) beyond which the normal mode no longer makes it possible to generate a complete image zone (as discussed above with reference to Figures 4a and 4b). The degraded-mode configuration provides for example a capture resolution of 2816 x 2112 pixels, reduced to an output resolution of 1408 x 2112 pixels.

[0019] If the threshold is crossed, the module 58 sends the camera 14 a sensor reconfiguration signal causing it to operate with a lower quality but a larger capture area. This degraded-quality configuration in binning mode is maintained as long as the roll angle θ exceeds the aforementioned threshold. Preferably, to avoid repeated switching around the threshold, a hysteresis is introduced, the return to the normal-mode configuration being triggered only if the roll angle drops, for example, below 12° (whereas the switch in the opposite direction had been commanded when this angle exceeded 15°).

[0020] Figure 7 is the counterpart of Figure 6 for three different possible configurations, including for example an intermediate configuration in scaling mode with a moderately degraded quality, in between the maximum quality of the normal mode and the more strongly degraded quality of the binning mode.

[0021] Switching between the three modes is performed in the same way as before, for example on the basis of roll angle ranges of 0-15° for the normal mode, 15-25° for the scaling mode and 25-45° for the binning mode, a hysteresis being introduced in each case for the switching from one configuration to another.

Figures 8a to 8e illustrate an example of switching between two different configurations, maximum and degraded, in a scenario where the drone is first flat, leans slightly, then leans more markedly, before returning to flat.

[0022] In these figures, the image zone ZI is indicated in dotted lines and the capture zone in dashed lines. This capture zone may be a basic capture zone ZCB when the camera operates in normal mode, or an extended capture zone ZCE when the camera operates in degraded mode, for example in 2 x 2 binning mode with an extended capture surface ZCE four times larger than the basic capture zone ZCB.

Initially (Figure 8a) the drone is horizontal, which typically corresponds to a hovering situation. The roll angle θ is 0° and the axis D of the basic capture zone and the axis B′ of the image zone ZI coincide.

[0023] When the drone leans slightly, with a roll angle θ of less than 15°, for example a roll angle θ = 12° (Figure 8b), the image zone ZI remains entirely inscribed within the basic capture zone ZCB. When the drone leans more markedly (Figure 8c), for example with a roll angle θ = 30°, the basic capture zone ZCB can no longer contain the image zone ZI, which overflows it in the non-captured zones ZX. The sensor then switches to a lower-definition configuration (binning or scaling) with an extended capture zone ZCE which can always contain the image zone ZI, even in the most extreme configurations (up to θ = 45°).
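The geometry behind Figures 4a-4b and 8a-8e can be verified with a short fit test: for two rectangles sharing the same centre, the rotated image zone fits inside the capture zone exactly when its axis-aligned bounding box does. The dimensions used below are illustrative and are not the actual zone sizes of the Bebop Drone.

```python
import math

def fits(image_wh, capture_wh, roll_deg):
    """True if the image zone, rotated by roll_deg about the common centre,
    stays entirely inside the capture zone (both given as (width, height) in pixels)."""
    iw, ih = image_wh
    cw, ch = capture_wh
    c = abs(math.cos(math.radians(roll_deg)))
    s = abs(math.sin(math.radians(roll_deg)))
    return iw * c + ih * s <= cw and iw * s + ih * c <= ch

image_zone = (1408, 1056)   # image zone ZI (illustrative)
zcb = (1696, 1408)          # basic capture zone ZCB (illustrative)
zce = (2816, 2112)          # extended capture zone ZCE (illustrative)
for angle in (0, 12, 30, 45):
    print(f"roll {angle:2d} deg  fits ZCB: {fits(image_zone, zcb, angle)}"
          f"  fits ZCE: {fits(image_zone, zce, angle)}")
```

With these assumed sizes the output mirrors the scenario of Figures 8a to 8e: the image zone still fits the basic zone at 12°, no longer fits it at 30°, and fits the extended zone up to 45°.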
[0024] When the drone performs the reverse manoeuvre, the configuration is maintained until a roll angle of θ = 12° is reached (Figure 8d). Note that in Figure 8d the inclination is the same as in Figure 8b, but that the capture zone remains the extended capture zone ZCE (hysteresis effect). Indeed, if the normal mode were restored too early, the margin of error would be too small to be certain that the surface of the capture zone will always be sufficient to contain the image zone. Only when this margin of error is large enough (Figure 8e) is the sensor switched back to the normal mode, with maximum quality.

Thus, in all cases, even when the roll becomes large, the stabilized view is always generated without any missing pixels in the image. When the binning mode is used (situation of Figures 8c and 8d), the image becomes slightly fuzzier, but this very fleeting change is almost imperceptible, and in any case much less annoying than a loss of pixels, which would translate into gray bands or corners in the final image delivered to the user.
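The dynamic switching with hysteresis described in paragraphs [0019] to [0024] amounts to a small two-state machine; the sketch below uses the 15° / 12° thresholds of the two-configuration example given above, and the class and method names are purely illustrative.

```python
NORMAL, DEGRADED = "normal", "binning"   # two illustrative sensor configurations

class ConfigSelector:
    """Two-state selector with hysteresis for the sensor operating configuration."""
    def __init__(self, up_threshold_deg: float = 15.0, down_threshold_deg: float = 12.0):
        self.up = up_threshold_deg       # switch to the degraded mode above this roll angle
        self.down = down_threshold_deg   # return to normal mode only below this one
        self.current = NORMAL

    def update(self, roll_deg: float) -> str:
        """Configuration to use for the next frame, given the predicted roll angle."""
        magnitude = abs(roll_deg)
        if self.current == NORMAL and magnitude > self.up:
            self.current = DEGRADED      # extended capture zone, lower definition
        elif self.current == DEGRADED and magnitude < self.down:
            self.current = NORMAL        # basic capture zone, full definition
        return self.current

sel = ConfigSelector()
for roll in (0, 10, 16, 30, 13, 11, 0):  # roll angles in degrees, mimicking Figures 8a-8e
    print(f"roll {roll:2d} deg -> {sel.update(roll)}")
```

At 13° the selector stays in the degraded configuration (hysteresis), and only drops back to the normal configuration once the roll angle falls below 12°, as in Figures 8d and 8e.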
Claims:
1. A motorized flying machine of the drone type (10), comprising:
- a camera (14) linked to the body of the drone, with a lens and a digital sensor collecting the image formed by the lens;
- an inertial unit (50) able to measure the Euler angles (φ, θ, ψ) describing the instantaneous attitude of the drone with respect to an absolute terrestrial reference frame;
- an extractor module (52), receiving as input pixel data of a capture zone (ZC) on the sensor and delivering as output pixel data of a reduced image zone (ZI) defined within the capture zone; and
- a servo module (54, 56), receiving as input values of the angles delivered by the inertial unit, and able to dynamically modify the position and the orientation of the image zone inside the capture zone, in a direction opposite to that of the changes of the values of the angles measured by the inertial unit,
characterized in that:
- the sensor is a sensor able to operate according to a plurality of different operating configurations, dynamically selectable during one and the same shooting sequence of the camera, said operating configurations comprising:
  - a basic configuration using a basic capture zone (ZCB) as the capture zone (ZC), and
  - at least one degraded-mode configuration using as the capture zone (ZC) an extended capture zone (ZCE) larger than the basic capture zone (ZCB), with application to the pixels of the extended capture zone of a conversion processing able to reduce the dimensions of the extended capture zone to those of the basic capture zone; and
- there is further provided a compensator module (58), receiving as input successive values of the roll angle (θ) of the drone delivered by the inertial unit, and able to dynamically modify the current operating configuration of the sensor so that the sensor operates:
  - according to the basic configuration for roll angle (θ) values lower than a first limit, and
  - according to a degraded-mode configuration for roll angle (θ) values greater than a second limit.

2. The drone of claim 1, wherein the roll-effect compensation means are able to dynamically modify the current operating configuration of the sensor:
- from the basic configuration to the degraded-mode configuration when the value of the roll angle increases beyond a first predetermined threshold, and
- from the degraded-mode configuration to the basic configuration when the roll angle of the drone decreases below a second predetermined threshold, the second threshold being different from the first threshold and lower than it.

3. The drone of claim 1, wherein the operating configurations comprise:
- the basic configuration (Configuration #1), using the basic capture zone (ZCB) as the capture zone (ZC);
- a moderately degraded-mode configuration (Configuration #2) using as the capture zone (ZC) a first extended capture zone larger than the basic capture zone, with application to the pixels of the first extended capture zone of a first conversion processing able to reduce the dimensions of the first extended capture zone to those of the basic capture zone; and
- a strongly degraded-mode configuration (Configuration #3) using as the capture zone (ZC) a second extended capture zone
larger than the first extended capture zone, with application to the pixels of the second extended capture zone of a second conversion processing, different from the first, able to reduce the dimensions of the second extended capture zone to those of the basic capture zone,
and wherein the compensator module is further able to dynamically modify the current operating configuration of the sensor such that the sensor operates:
- according to the basic configuration for roll angle values lower than the first limit;
- according to the moderately degraded-mode configuration for roll angle values greater than the second limit and lower than a third limit; and
- according to the strongly degraded-mode configuration for roll angle values greater than the third limit.
Patent family:
Publication number | Publication date
EP3048789B1 | 2016-12-28
US9876959B2 | 2018-01-23
US20180109731A1 | 2018-04-19
CN105915786A | 2016-08-31
EP3048789A1 | 2016-07-27
JP2016140069A | 2016-08-04
FR3032052B1 | 2017-03-10
US10326939B2 | 2019-06-18
US20160219223A1 | 2016-07-28
Cited documents:
Publication number | Filing date | Publication date | Applicant | Title
EP2031864A1 | 2006-05-12 | 2009-03-04 | Opt Corporation | Dynamic image display method, dynamic image display system, and wide-angle dynamic image capturing device
US20110128350A1 | 2009-11-30 | 2011-06-02 | Motorola, Inc. | Method and apparatus for choosing a desired field of view from a wide-angle image or video
EP2613214A1 | 2012-01-05 | 2013-07-10 | Parrot | Method for controlling a rotary-wing drone to operate photography by an on-board camera with minimisation of interfering movements
US6672538B2 | 2002-05-23 | 2004-01-06 | Sikorsky Aircraft Corporation | Transmission for a coaxial counter rotating rotor system
FR2938774A1 | 2008-11-27 | 2010-05-28 | Parrot | Device for controlling a drone
FR2957266B1 | 2010-03-11 | 2012-04-20 | Parrot | Method and apparatus for remote control of a drone, in particular a rotating sail drone
FR2961601B1 | 2010-06-22 | 2012-07-27 | Parrot | Method for evaluating the horizontal speed of a drone, in particular a drone suitable for autopilot stationary flight
CA2848223A1 | 2011-09-09 | 2013-03-14 | SZ DJI Technology Co., Ltd | Dual-axis platform for use in a small unmanned aerial vehicle and tri-axis platform for use in a small unmanned aerial vehicle
CN102566441A | 2011-12-29 | 2012-07-11 | 成都飞机工业(集团)有限责任公司 | Visual simulation test system for unmanned aerial vehicle
FR2985329B1 | 2012-01-04 | 2015-01-30 | Parrot | Method for intuitive control of a drone using a remote control apparatus
EP2880860A4 | 2012-08-01 | 2016-06-08 | Bye Uas Inc | Small UAS with high definition video
US9070289B2 | 2013-05-10 | 2015-06-30 | Palo Alto Research Incorporated | System and method for detecting, tracking and estimating the speed of vehicles from a mobile platform
US8903568B1 | 2013-07-31 | 2014-12-02 | SZ DJI Technology Co., Ltd | Remote control method and terminal
FR3020169A1 | 2014-04-16 | 2015-10-23 | Parrot | Rotating wing drone with video camera delivering stabilized image sequences
JP6146940B2 | 2014-04-28 | 2017-06-14 | SZ DJI Technology Co., Ltd | Platform
KR102219698B1 | 2014-04-30 | 2021-02-24 | 엘지전자 주식회사 | Apparatus and method for controlling unmanned aerial vehicle
US9586683B1 | 2014-12-22 | 2017-03-07 | Amazon Technologies, Inc. | Transitioning an unmanned aerial vehicle to horizontal flight
FR3032052B1 | 2015-01-26 | 2017-03-10 | Parrot | Drone equipped with a video camera and means for compensating the artifacts produced at the most important roll angles
FR3055075B1 | 2016-08-09 | 2021-09-17 | Vincent Boucher | 4π steradian stabilized and omnidirectional shooting device allowing to obtain a still image when the device is in motion
FR3055078B1 | 2016-08-11 | 2019-05-31 | Parrot Drones | Image capture method, computer program, and electronic capture system of associated video
FR3055077B1 | 2016-08-11 | 2018-10-05 | Parrot Drones | Method of capturing video, computer program, and electronic capture system of associated video
BR112019005759A2 | 2016-09-23 | 2019-06-11 | Qualcomm Inc | Image processing on an unmanned autonomous vehicle
WO2018053786A1 | 2016-09-23 | 2018-03-29 | Qualcomm Incorporated | Adaptive motion filtering in an unmanned autonomous vehicle
CN106375669B | 2016-09-30 | 2019-08-06 | 天津远度科技有限公司 | A digital image stabilization method, device and unmanned plane
KR20180054144A | 2016-11-15 | 2018-05-24 | 삼성전자주식회사 | Electronic device and method for controlling moving device using the same
KR20180068411A | 2016-12-14 | 2018-06-22 | 삼성전자주식회사 | Controlling method for operation of unmanned vehicle and electronic device supporting the same
CN107000833B | 2016-12-26 | 2019-01-22 | 深圳市大疆创新科技有限公司 | Unmanned plane
US20180231379A1 | 2017-02-14 | 2018-08-16 | Honeywell International Inc. | Image processing system
CN108521805A | 2017-03-28 | 2018-09-11 | 深圳市大疆灵眸科技有限公司 | Control method based on the installation condition of a gimbal, and aircraft
CN107632974B | 2017-08-08 | 2021-04-13 | 北京微瑞思创信息科技股份有限公司 | Chinese analysis platform suitable for multiple fields
KR101951666B1 | 2017-10-05 | 2019-02-25 | 윤태기 | Drone for taking pictures and controlling method thereof
EP3550506B1 | 2018-04-05 | 2021-05-12 | Everdrone AB | A method for improving the interpretation of the surroundings of a UAV, and a UAV system
US10917573B1 | 2019-05-21 | 2021-02-09 | Gopro, Inc. | Systems and methods for stabilizing videos
WO2020246970A1 | 2019-06-05 | 2020-12-10 | Hewlett-Packard Development Company, L.P. | Structural measurement using a fixed pattern
Legal status:
2016-01-26 | PLFP | Fee payment | Year of fee payment: 2
2016-07-29 | PLSC | Search report ready | Effective date: 20160729
2017-01-20 | PLFP | Fee payment | Year of fee payment: 3
2017-07-21 | TP | Transmission of property | Owner name: PARROT DRONES, FR; Effective date: 20170616
2018-11-02 | ST | Notification of lapse | Effective date: 20180928
Priority application:
FR1550568A | 2015-01-26 | DRONE EQUIPPED WITH A VIDEO CAMERA AND MEANS FOR COMPENSATING THE ARTIFACTS PRODUCED AT THE MOST IMPORTANT ROLL ANGLES

Family applications claiming this priority: FR1550568A (FR3032052B1), EP16152385.7A (EP3048789B1), US15/004,733 (US9876959B2), JP2016011421A (JP2016140069A), CN201610255068.3A (CN105915786A), US15/842,710 (US10326939B2).