DRONE EQUIPPED WITH A VERTICAL VIEWED VIDEO CAMERA COMPENSATED WITH INSTANTANEOUS ROTATIONS FOR ESTIMATING HORIZONTAL SPEEDS
Abstract:
A vertical-aiming camera (16) delivers an image signal (ScamV) of the terrain overflown by the drone. Gyrometric sensors (102) measure the Euler angles (φ, θ, ψ) characterizing the attitude of the drone and deliver a gyrometric signal (Sgyro) representative of the instantaneous rotations. Rotation compensation means (136) receive the image signal and the gyrometric signal and output recalibrated image data compensated for the rotations, which are then used to estimate the horizontal velocities of the drone. The camera and the inertial unit are driven by a common clock (160), and a circuit (170) is provided for determining the value of the phase shift between the gyrometric signal and the image signal and applying this phase-shift value at the input of the rotation compensation means (136), which resynchronize the image signal on the gyrometric signal before calculating the recalibrated image data.
Publication number: FR3034554A1
Application number: FR1552813
Filing date: 2015-04-01
Publication date: 2016-10-07
Inventor:
Applicant: Parrot SA
IPC main class:
Description:
[0001] The invention relates to the control of motorized flying machines such as drones, in particular rotary-wing drones such as quadricopters. The AR.Drone 2.0 and the Bebop Drone of Parrot SA, Paris, France are typical examples of such quadricopters. They are equipped with a series of sensors (accelerometers, 3-axis gyrometers, altimeters), a front camera capturing an image of the scene towards which the drone is directed, and a vertical-aiming camera capturing an image of the terrain overflown. They are provided with multiple rotors driven by respective motors that can be controlled in a differentiated manner in order to control the attitude and speed of the drone. Various aspects of these drones are described, inter alia, in EP 2 364 757 A1, EP 2 400 460 A1, EP 2 613 213 A1, EP 2 613 214 A1 or EP 2 644 240 A1 (Parrot SA). The invention relates more precisely to the measurement of the horizontal velocity of the drone, which essentially relies on the use of the image supplied by the vertical-aiming camera, by means of a calculator which analyzes the sequence of images captured by this camera. The estimated displacement of the camera gives a measure of the horizontal translation of the drone in both directions, independently of the inertial sensors (accelerometers). More precisely, two consecutive images given by the camera are analyzed to give an apparent displacement, in pixels, in each of the two horizontal dimensions of an absolute terrestrial reference. If the altitude is known (given by the ultrasonic and barometric sensors carried by the drone), it is possible to convert this apparent displacement of the images into a distance, and therefore into a speed, knowing the time interval separating two successive images. The above-mentioned EP 2 400 460 A1 describes such a technique for measuring the horizontal speed, based on an algorithm of the so-called "optical flow" type, with possible dynamically controlled switching to another algorithm of the "point detector" or "point-of-interest detection" type when the optical-flow algorithm is no longer able to deliver a sufficiently reliable result. The optical-flow method has many advantages, in particular that it imposes very few constraints on the contrast and content of the scene, and that it can estimate high speeds as well as low speeds by a "multi-resolution" approach. On the other hand, this method is very sensitive to rotations and changes of attitude, and it does not make it possible to verify intrinsically the quality of the result delivered: the algorithm always delivers a result when enough points have a high gradient, but this result is delivered even if it does not make sense. In practice, the result delivered is neither always reliable nor very precise, especially at low speeds. However, if an effective and reactive automatic stabilization servo-control in hovering flight is to be achieved, it is necessary to have a measurement of the horizontal speed that is both precise and sensitive (because the linear velocities around the equilibrium point can be very low) and available in real time. The rotations and attitude changes that may affect the measurement of the horizontal speed of the drone can be measured in real time by the inertial unit. The implementation of an optical-flow algorithm also includes, in a manner known per se, a step of compensating the rotations between the images successively used. For this purpose, the errors which may be introduced (rotations which add a component along the intersection between the image plane and the plane perpendicular to the axis of rotation) are compensated by adding to the estimated optical flow the displacement of the optical centre due to the rotation between two successive images. In other words, in case of rotation between two successive images of the scene captured by the camera, a new version of the image is generated before being compared with the previous image, by a projection matrix calculation
executed in real time by the software within the signal processor of the drone. This rotation compensation makes it possible to make the speed calculation robust to the rotations encountered, in particular when hovering, and to maintain a linear model for the movement of the drone. A difficulty in the implementation of this rotation-compensation technique lies in the fact that errors in the values of the angles of rotation (in particular of roll and pitch) delivered to the algorithm by the inertial unit produce the same consequences as a translation of the image, so that even minimal errors are likely to generate biases in the estimation of the components of the horizontal velocity, even though the rotation-compensation algorithm perfectly recalibrates one image on the other. These minimal errors may notably arise from a lack of synchronism between the image signal delivered by the vertical camera and the gyrometric signal delivered by the inertial unit. If any error in the rotation-compensation process is to be avoided, it is important to ensure, for each image, an extremely precise synchronization of these two signals. [0002] These synchronization errors can in particular result from a certain latency in the software processing of the signals, a latency which is not constant and varies unpredictably, particularly as a function of the processor's workload at a given instant. The object of the present invention is to remedy these drawbacks by proposing a solution making it possible to guarantee at all times an extremely precise synchronization between the image signal delivered by the vertical camera and the gyrometric signal delivered by the inertial unit of the drone. The aim is to avoid introducing any bias into the correction of rotations between successive images, with the important consequences explained above with respect to the accuracy of the estimation of the horizontal velocity components of the drone.
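As a rough illustration of the two steps just described, converting an apparent pixel displacement into a horizontal velocity using the altitude, and removing the component of apparent displacement caused by a rotation between the two frames, here is a minimal sketch. The pinhole-model arithmetic is an illustrative assumption, not the patent's actual algorithm; the sensor figures match the QVGA vertical camera described later in the text (320-pixel width, 64° field of view, 60 fps).

```python
import math

FOV_DEG = 64.0    # vertical-camera field of view (per the text)
WIDTH_PX = 320    # QVGA image width (per the text)
FPS = 60.0        # vertical-camera frame rate (per the text)
# Pinhole focal length in pixels, derived from the field of view.
FOCAL_PX = (WIDTH_PX / 2.0) / math.tan(math.radians(FOV_DEG) / 2.0)

def horizontal_velocity(flow_px, delta_angle_rad, altitude_m):
    """Estimate one horizontal-velocity component from the apparent
    optical-flow displacement between two consecutive frames.

    1. Subtract the apparent shift caused by the camera rotation
       between the two frames (about focal * tan(delta) pixels),
       which is not a translation of the drone.
    2. Scale the residual pixel displacement by the ground footprint
       of one pixel at the current altitude, and divide by the frame
       interval to obtain metres per second.
    """
    translation_px = flow_px - FOCAL_PX * math.tan(delta_angle_rad)
    metres_per_px = (2.0 * altitude_m
                     * math.tan(math.radians(FOV_DEG) / 2.0) / WIDTH_PX)
    return translation_px * metres_per_px * FPS

# 5 px of measured flow at 2 m altitude, with a 0.5-degree pitch change
# between the frames: about 2.2 px of the flow is rotation, not motion.
v = horizontal_velocity(5.0, math.radians(0.5), 2.0)
```

The sketch also shows why an angle error mimics a translation: any error in `delta_angle_rad` shifts `translation_px` directly, and is then multiplied into the velocity estimate.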
For this purpose, the invention proposes a drone comprising, in a manner known per se: a vertical-aiming video camera, able to point towards a scene of the terrain overflown by the drone, to capture a sequence of images of this scene and to output a corresponding digital image signal; an inertial unit comprising gyrometric sensors able to measure the Euler angles characterizing the attitude of the drone with respect to an absolute terrestrial reference and to output a gyrometric signal representative of the instantaneous rotations of the drone; rotation compensation means, receiving at input i) the image signal delivered by the camera and ii) the gyrometric signal delivered by the inertial unit, and outputting recalibrated image data compensated for the rotations of the drone from one image to the next; and means for estimating the horizontal speeds of the drone from the movements, from one image to the next, of the scene picked up by the camera, these estimating means receiving as input said recalibrated image data. Characteristically, according to the invention: the camera and the inertial unit are controlled by a common clock; the acquisition frequency of the gyrometric sensors is a multiple of the delivery frequency of the image signal of the camera; and there is further provided a circuit adapted to: compare the gyrometric signal delivered by the inertial unit with the image signal delivered by the camera; determine the value of the phase shift between these respective signals; and apply this phase-shift value at the input of the rotation compensation means; the rotation compensation means being furthermore able to perform, before calculating said recalibrated image data, a preliminary correction of resynchronization of the image signal delivered by the camera on the gyrometric signal delivered by the inertial unit.
Very preferably, the circuit able to compare the gyrometric signal delivered by the inertial unit with the image signal delivered by the camera, to determine the value of the phase shift between these respective signals, and to apply this phase-shift value at the input of the rotation compensation means, is a circuit implemented in hardware. Preferably, the drone further comprises integrating means able to integrate the gyrometric signal delivered by the inertial unit over the time interval between the delivery of the image signals corresponding to two consecutive images of the image sequence. [0003] An exemplary embodiment of the present invention will now be described with reference to the accompanying drawings, in which like reference numerals designate identical or functionally similar elements from one figure to another. Figure 1 is a general view showing a drone controlled by a remote-control device. [0004] Figure 2 is a block diagram of the various servo-control and steering controls of the drone. Figure 3 illustrates, in block-diagram form, the various elements contributing to the synchronization mechanism between the video camera and the gyrometers. [0005] Figure 4 illustrates various timing diagrams showing the temporal relationship between the signals delivered by the camera and the gyrometers. [0006] An embodiment of the invention will now be described. In Figure 1, the reference 10 generally denotes a drone, which is for example a quadricopter such as the Bebop Drone model of Parrot SA, Paris, France. This drone comprises four coplanar rotors 12 whose motors are controlled independently by an integrated navigation and attitude-control system. It is provided with a front-facing camera 14 which makes it possible to obtain an image of the scene towards which the drone is heading, for example a wide-angle high-definition CMOS-sensor camera with a resolution of 1920 × 1080 pixels and a video-stream refresh rate of 30 fps (frames per second). The drone is also provided with a vertical-aiming camera 16 pointing downwards, able to capture successive images of the terrain overflown T and used in particular to evaluate the speed of the drone relative to the ground. This is for example a QVGA-resolution CMOS-sensor camera (320 × 240 pixels) with a video-stream refresh rate of 60 fps and a 64° field of view. The drone is also equipped with inertial sensors (accelerometers and gyrometers) to measure with a certain accuracy the angular velocities and attitude angles of the drone, that is to say the Euler angles (pitch θ, roll φ and yaw ψ) describing the inclination of the drone with respect to a horizontal plane of a fixed terrestrial reference UVW, it being understood that the two longitudinal and transverse components of the horizontal velocity are intimately related to the inclination about the two respective pitch and roll axes. The drone 10 is controlled by a remote-control device 18 such as a touch-screen multimedia telephone or tablet with integrated accelerometers, for example a smartphone of the iPhone (registered trademark) type, a tablet of the iPad (registered trademark) type, or other. This is a standard device, unmodified except for the loading of specific application software such as the AR.FreeFlight (registered trademark) mobile application, to control the piloting of the drone 10 and the visualization of the captured image. The apparatus 18 comprises a body 20 and a touch screen 22 displaying the image of the scene captured by the front camera 14, with a number of superimposed symbols enabling the activation of piloting commands (climb/descent, etc.) by a simple touch of a finger 24 of the user on the symbols displayed on the screen. The apparatus 18 is also provided with inclination sensors making it possible to control the attitude of the drone by imparting to the apparatus corresponding inclinations about the roll and pitch axes in order to move it forwards or backwards.
The actions of the user are interpreted by the specific application software, which transforms them into command signals for the drone. Figure 2 is a block diagram of the various control, servo-control and steering components of the drone, as well as of the correction of the movements of the image according to the technique of the invention (it will be noted that, although these diagrams are presented in the form of interconnected circuits, the implementation of the various functions is essentially software-based, this representation having only an illustrative character). In general, the control system involves several nested loops for the control of the horizontal speed, the angular speed and the attitude of the drone, and the altitude variations, automatically or under control of the user. The most central loop is the angular-velocity control loop 100, which uses, on the one hand, the signals supplied by the gyrometers 102 and, on the other hand, a reference constituted by angular-velocity setpoints 104. This information is applied at the input of an angular-velocity correction stage 106, which itself drives a motor-control stage 108 in order to separately control the speed of the various motors so as to correct the angular velocity of the drone by the combined action of the rotors driven by these motors. [0007] The angular-velocity control loop 100 is embedded in an attitude-control loop 112, which operates from the indications provided by the gyrometers 102 and by accelerometers 114. The data from these sensors are applied to a stage 118 which produces an estimate of the real attitude of the drone, applied to an attitude-correction stage 120. This stage 120 compares the real attitude of the drone with angle commands generated by a circuit 122 from commands directly applied by the user 124 and/or from data generated internally by the autopilot of the drone via the circuit 126 for horizontal-speed correction.
The possibly corrected setpoints applied to the circuit 120 and compared with the actual attitude of the drone are transmitted by the circuit 120 to the circuit 104 to control the motors appropriately. A horizontal-speed control loop 130 includes the vertical video camera 16, an ultrasonic telemetry sensor 132 and a barometric sensor 134 acting as an altimeter. A circuit 136 processes the images produced by the vertical camera 16, in combination with the signals of the accelerometer 114 and of the attitude-estimation circuit 118, to produce data from which a circuit 138 obtains an estimate of the horizontal velocities Vx and Vy along the two pitch and roll axes of the drone. The estimated horizontal velocities Vx and Vy are corrected by the vertical-velocity estimate Vz given by a circuit 140 and by an estimate of the value of the altitude z given by the circuit 142 from the information of the sensors 132 and 134. [0008] For the control of the vertical displacements of the drone, the user 124 applies commands to an altitude-setpoint calculation circuit 144, which setpoint is applied, via a correction circuit, to a circuit 146 for calculating an ascent-speed setpoint Vz; the ascent-speed setpoint thus calculated is applied to a circuit 150 which compares it with the corresponding speed estimated by the circuit 140 and modifies the motor-control data (circuit 108) accordingly, increasing or decreasing the speed of rotation simultaneously on all the motors so as to minimize the difference between the setpoint ascent speed and the measured ascent speed. With regard to the front video camera 14, the latter delivers raw video data (pixel data) to a windowing circuit 152 ensuring the selection of the useful pixels in a capture zone determined by the inertial unit 154 (comprising the gyrometers 102, the accelerometers 114 and the attitude-estimation circuit 118).
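The nesting of the loops described in the last two paragraphs can be caricatured with purely proportional corrections. This is a sketch only: the gains and the single-axis simplification are arbitrary illustrative assumptions, not the actual control law of the drone's autopilot.

```python
def cascaded_step(v_target, v_meas, angle_meas, rate_meas,
                  kv=0.1, ka=4.0, kr=20.0):
    """One update of the nested control structure, with illustrative
    proportional gains: the outer loop turns a horizontal-velocity
    error into an angle setpoint, the attitude loop turns the angle
    error into an angular-rate setpoint, and the innermost loop turns
    the rate error into a differential motor command."""
    angle_sp = kv * (v_target - v_meas)      # horizontal-speed loop (130)
    rate_sp = ka * (angle_sp - angle_meas)   # attitude loop (112)
    motor_cmd = kr * (rate_sp - rate_meas)   # angular-rate loop (100)
    return motor_cmd

# Hovering (target velocity 0) while drifting at 0.5 m/s: the cascade
# produces a corrective differential command on the motors.
cmd = cascaded_step(v_target=0.0, v_meas=0.5, angle_meas=0.0, rate_meas=0.0)
```

The point of the cascade is that each inner loop runs faster than the one enclosing it, so the outer loops can treat the inner ones as ideal actuators.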
The video data selected in the capture zone are applied to a geometric-distortion correction circuit 156 which produces rectified video data, themselves delivered to a transmitter circuit 158 ensuring the transmission of the video image to the remote-control device held by the user. The various aspects relating to the synchronization of the gyrometric signals and of the image signals of the vertical camera, for calculating the horizontal speeds estimated by the circuit 138, will now be described in more detail with reference to Figure 3. [0009] The gyrometers 102, the front (horizontal) camera 14 and the vertical-aiming camera 16 are driven by a common clock circuit 160, the respective operating frequencies of the gyrometers and of the cameras 14 and 16 being sub-multiples of the frequency CLK of this clock 160. [0010] The gyrometers 102 and the vertical-aiming camera 16 are thus configured so that: Fgyro = K·FcamV, where Fgyro is the acquisition frequency of the gyrometers, FcamV is the acquisition frequency of the images by the camera 16 (typically FcamV = 60 Hz), and K is a positive integer. The fact that K is an integer and that the base clock is the same for the gyrometers and the camera ensures that there will always be K samples of the gyrometric signal Sgyro per image of the signal ScamV, without any drift, the angle measurements always falling at the same instants. The horizontal camera 14 delivers a signal ScamH to the aforementioned windowing and distortion-correction circuits 152 and 156, making it possible to stabilize the video image and to correct various artifacts before it is delivered to the user. The gyrometers 102 deliver a signal Sgyro transmitted to the aforementioned attitude-estimation circuit 118 of the drone, which estimates the rotations φ, θ and ψ of the drone about the three axes and transmits them to the circuits 152, 156 and 136.
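The no-drift property guaranteed by the integer ratio Fgyro = K·FcamV can be checked with a toy model in which both devices divide one common clock. The clock frequency below is an arbitrary illustrative value; K = 4 matches the "typically four samples" figure given later in the text.

```python
CLK = 12_000_000   # common clock (Hz) -- illustrative value, not from the text
FCAMV = 60         # vertical-camera frame rate (Hz, per the text)
K = 4              # integer ratio Fgyro = K * FcamV
FGYRO = K * FCAMV

# Both devices are driven by integer dividers of the same clock, so
# every event falls on an exact clock-tick count.
cam_div, gyro_div = CLK // FCAMV, CLK // FGYRO
frame_ticks = [i * cam_div for i in range(10)]
gyro_ticks = [i * gyro_div for i in range(10 * K)]

# Exactly K gyro samples fall inside every frame interval, at the same
# relative instants in every frame: no drift can ever accumulate.
per_frame = [sum(1 for t in gyro_ticks if f <= t < f + cam_div)
             for f in frame_ticks]
```

Had the two devices run on independent oscillators, even a tiny frequency mismatch would make the counts alternate between K and K ± 1 over time, which is precisely what the common clock rules out.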
[0011] The vertical camera 16 delivers a signal ScamV applied to the aforementioned circuit 136 which, among other video processing, compensates for the rotations from one image to the next according to the estimated rotations delivered for each image by the circuit 118, before calculating the horizontal velocities Vx and Vy by applying various algorithms, including in particular an optical-flow algorithm operating by comparing the apparent displacement between two consecutive images of the terrain overflown by the drone. Figure 4 illustrates a series of timing diagrams showing: the signal Vsync CamH synchronizing the horizontal camera 14, the signal Vsync Gyro synchronizing the gyrometers 102, and the signal Vsync CamV synchronizing the vertical camera 16. The rate of the images delivered by the horizontal camera is typically 30 Hz, and that of the images delivered by the vertical camera 60 Hz. [0012] As can be seen in these timing diagrams, the horizontal camera performs a relatively slow acquisition, an image acquisition beginning on the rising edge of the Vsync CamH signal and ending on the falling edge of that same signal. An image of the horizontal camera therefore corresponds to a large number of samples of the gyrometers (six samples of the Sgyro signal in the simplified example illustrated, in fact typically 30 samples). As far as the vertical camera is concerned, the acquisitions are made much more rapidly, between the rising edge and the falling edge of the Vsync CamV signal, and the velocity-estimation algorithm must know as precisely as possible the variation in attitude of the drone between two successive images. In the illustrated example, this acquisition corresponds to only a single sample of the gyrometric signal (in reality, typically four samples). The variation of the attitude of the drone between two successive images is advantageously calculated by integration of the gyrometric signal Sgyro
delivered by each of the gyrometers (Integr Sgyro timing diagram of Figure 4). Above all, it is essential to have an extremely precise synchronization between the gyrometric signal Sgyro and the image signal ScamV delivered by the vertical camera. [0013] In this respect, although the signal Sgyro of the gyrometric sensor and the signal ScamV of the camera 16 are delivered at the same rate (to within an integer multiple), this gives no guarantee as to the phase concordance of these two signals, that is to say, as to their true synchronism. More precisely, the common clock ensures that the gyrometric signal Sgyro does not "slip" with respect to the video signal ScamV. This means that when a new image is available, the same time interval always elapses before the gyrometer delivers new data. On the other hand, this time interval varies from one drone to another, and from one piloting sequence to another, because the gyrometric sensors have not been started at the same time as the video camera. Indeed, the video acquisitions and the gyrometric acquisitions are triggered by software, and it is therefore not certain that the two acquisitions start at the same time, nor that the time interval separating the two starts is constant from one drone to another, or from one piloting sequence to another for the same drone. To guarantee perfect synchronization, the invention proposes, in a characteristic way, the use of a hardware component 170, which measures with great accuracy the time offset Δ between the gyrometric and video signals Sgyro and ScamV (it will be noted that a single measurement is sufficient, since the clocks have been adjusted so that they do not drift). The two mechanisms that have just been described, the common clock 160 and the hardware circuit 170 for measuring the phase shift, make it possible to align the gyrometric and video signals in time with a very high degree of accuracy, to within one clock cycle.
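A toy model of these two mechanisms, the one-shot measurement of the offset Δ followed by resynchronization of the gyrometric samples onto the frame instants, might look as follows. The linear interpolation is an assumption made for illustration; the text does not specify how the preliminary resynchronization correction is computed.

```python
def measure_offset(frame_tick, gyro_tick, gyro_period):
    """One-shot measurement of the fixed phase shift between the video
    and gyrometric signals, counted in clock ticks, as the hardware
    circuit does: the delay between a frame edge and a gyro edge,
    reduced modulo the gyro period.  Because both signals derive from
    the same clock, this offset never drifts, so measuring it once is
    enough."""
    return (gyro_tick - frame_tick) % gyro_period

def resync(gyro_samples, offset, gyro_period):
    """Shift gyro angle samples onto the frame instants by linear
    interpolation (illustrative scheme).  Sample i is taken at time
    offset + i*T; the frame instant (i+1)*T falls a fraction
    (T - offset)/T of the way between samples i and i+1."""
    frac = (gyro_period - offset) / gyro_period
    return [a + frac * (b - a) for a, b in zip(gyro_samples, gyro_samples[1:])]

# The gyrometers started 12 ticks after the camera; gyro period 50 ticks.
delta = measure_offset(frame_tick=0, gyro_tick=12, gyro_period=50)
angles = resync([0.0, 1.0, 2.0, 3.0], delta, 50)
```

With a linearly increasing angle the resynchronized values land exactly on the frame instants, which is the property the rotation-compensation stage needs before it recalibrates one image on the other.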
[0014] The system clock 160 operating at several megahertz, this represents a few nanoseconds of error in the alignment between the video and gyrometric signals, which is very low and makes it possible to operate an extremely precise and efficient correction. [0015] In the absence of this mechanism, on the other hand, it would have been necessary to record by software the instant of delivery of each new gyrometer datum and of each new acquired image. Such a method would be much less precise and more irregular, because it is sensitive to the reaction time of the system, and would provide an accuracy of only about 100 µs.
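The integration of the gyrometric signal between two consecutive images, shown in the Integr Sgyro timing diagram of Figure 4, can be sketched as a simple trapezoidal sum over the gyro samples of the inter-frame interval. The sample values below are arbitrary; the trapezoidal rule is an illustrative choice, since the text does not name the integration scheme.

```python
def attitude_delta(rate_samples, dt):
    """Integrate angular-rate samples (rad/s), taken at interval dt
    between two consecutive images, into an attitude change (rad)
    over the spanned interval -- trapezoidal integration of Sgyro."""
    total = 0.0
    for a, b in zip(rate_samples, rate_samples[1:]):
        total += 0.5 * (a + b) * dt
    return total

# Four gyro samples per 60 Hz frame (the typical figure given above),
# i.e. one sample every 1/240 s.
delta_theta = attitude_delta([0.10, 0.12, 0.11, 0.13], 1.0 / 240.0)
```

This is exactly where the measured offset Δ matters: integrating the wrong window of samples attributes part of one frame's rotation to the next, and the resulting angle error masquerades as a translation in the velocity estimate.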
Claims:
Claims (3) [0001] 1. A drone (10), comprising: - a vertical-aiming video camera (16), able to point towards a scene of the terrain (T) overflown by the drone, to capture a sequence of images of this scene and to output a corresponding digital image signal (ScamV); - an inertial unit (154) comprising gyrometric sensors (102) able to measure the Euler angles (φ, θ, ψ) characterizing the attitude of the drone with respect to an absolute terrestrial reference (UVW) and to output a gyrometric signal (Sgyro) representative of the instantaneous rotations of the drone; - rotation compensation means (136), receiving at input i) the image signal (ScamV) delivered by the camera and ii) the gyrometric signal (Sgyro) delivered by the inertial unit, and outputting recalibrated image data compensated for the rotations of the drone from one image to the next; and - means (138) for estimating the horizontal speeds of the drone from the movements, from one image to the next, of the scene captured by the camera, said estimating means receiving as input said recalibrated image data, characterized in that: - the camera and the inertial unit are controlled by a common clock (160); - the acquisition frequency of the gyrometric sensors is a multiple of the delivery frequency of the image signal of the camera; and - there is further provided a circuit (170) able to: - compare the gyrometric signal (Sgyro) delivered by the inertial unit with the image signal (ScamV) delivered by the camera; - determine the value of the phase shift between these respective signals; and - apply this phase-shift value at the input of the rotation compensation means (136); and the rotation compensation means (136) are furthermore able to perform, prior to calculating said recalibrated image data, a preliminary correction of resynchronization of the image signal (ScamV) delivered by the camera on the gyrometric signal delivered by the inertial unit. [0002] 2.
The drone of claim 1, wherein said circuit (170), able to compare the gyrometric signal delivered by the inertial unit with the image signal delivered by the camera, to determine the value of the phase shift between these respective signals, and to apply this phase-shift value at the input of the rotation compensation means, is a circuit implemented in hardware. [0003] 3. The drone of claim 1, further comprising: integrating means able to integrate the gyrometric signal delivered by the inertial unit over the time interval between the delivery of the image signals corresponding to two consecutive images of the image sequence.
Patent family:
Publication number | Publication date
FR3034554B1 | 2017-04-28
EP3076258B1 | 2018-08-01
US20160290809A1 | 2016-10-06
CN106053874A | 2016-10-26
EP3076258A1 | 2016-10-05
US9797728B2 | 2017-10-24
JP2016193720A | 2016-11-17
References cited:
Publication number | Filing date | Publication date | Applicant | Title
EP2400460A1 | 2010-06-22 | 2011-12-28 | Parrot | Method for assessing the horizontal speed of a drone, particularly of a drone capable of hovering on automatic pilot
FR2988868B1 | 2012-03-30 | 2015-04-24 | Parrot | Method for controlling a multi-rotor rotary-wing drone with estimation and compensation of side wind
Cited by:
FR3037672B1 | 2015-06-16 | 2017-06-16 | Parrot | Drone comprising improved compensation means through the inertial centre based on temperature
USD795967S1 | 2016-01-27 | 2017-08-29 | Horizon Hobby, LLC | Quadcopter
US10212346B2 | 2016-06-08 | 2019-02-19 | Amazon Technologies, Inc. | Multi-sensor image stabilization techniques
USD830946S1 | 2016-06-23 | 2018-10-16 | Teal Drones, Inc. | Quadrotor
US10139836B2 | 2016-09-27 | 2018-11-27 | International Business Machines Corporation | Autonomous aerial point of attraction highlighting for tour guides
EP3529683B1 | 2016-10-18 | 2022-02-09 | Deakin University | Thrust vectored multicopters
FR3060178A1 | 2016-12-09 | 2018-06-15 | Parrot Drones | Electronic device for piloting a drone, drone, piloting method and computer program
EP3559597A4 | 2016-12-20 | 2020-08-19 | Smart Skin Technologies Inc. | Packaging device for measuring motion in manufacture
CN110057352B | 2018-01-19 | 2021-07-16 | Beijing Tusen Zhitu Technology | Camera attitude angle determination method and device
US20190324447A1 | 2018-04-24 | 2019-10-24 | Kevin Michael Ryan | Intuitive controller device for UAV
CN113141459A | 2020-10-16 | 2021-07-20 | Beijing Institute of Technology | Unmanned aerial vehicle airborne vision intelligent processing system and method
Legal status:
2016-04-22 | PLFP | Fee payment | Year of fee payment: 2
2016-10-07 | PLSC | Search report ready | Effective date: 20161007
2017-04-18 | PLFP | Fee payment | Year of fee payment: 3
2017-07-21 | TP | Transfer of property | Owner name: PARROT DRONES, FR | Effective date: 20170616
2018-04-23 | PLFP | Fee payment | Year of fee payment: 4
2020-01-10 | ST | Notification of lapse | Effective date: 20191206
Priority:
Application number | Publication | Priority date | Filing date | Title
FR1552813 | FR3034554B1 | 2015-04-01 | 2015-04-01 | Drone equipped with a vertical viewed video camera compensated with instantaneous rotations for estimating horizontal speeds
EP16162801.1 | EP3076258B1 | 2015-04-01 | 2016-03-30 | Drone provided with a video camera with compensated vertical focussing of instantaneous rotations for estimating horizontal speeds
US 15/087,883 | US9797728B2 | 2015-04-01 | 2016-03-31 | Drone provided with a vertical-view video camera compensated for the instantaneous rotations for estimation of the horizontal speeds
JP2016070013 | JP2016193720A | 2015-04-01 | 2016-03-31 | Drone comprising vertical view video camera having momentary rotation compensated for estimation of horizontal speed
CN201610403579.5 | CN106053874A | 2015-04-01 | 2016-04-01 | Drone provided with a vertical-view video camera compensated for the instantaneous rotations for estimation of the horizontal speeds
Sulfonates, polymers, resist compositions and patterning process
Washing machine
Washing machine
Device for fixture finishing and tension adjusting of membrane
Structure for Equipping Band in a Plane Cathode Ray Tube
Process for preparation of 7 alpha-carboxyl 9, 11-epoxy steroids and intermediates useful therein an
国家/地区
|