Patent abstract:
A method of automatically piloting a rotary-wing drone to perform camera movements with an onboard camera. The subject of the invention is a method of autonomously controlling, by means of a "base station", a multiple-rotor rotary-wing drone in attitude and speed, following a selected camera movement and the position of the subject to be filmed. The method comprises the following steps: 1. Selection by the user of a camera movement (6) defined by a set of parameters comprising: shooting mode, at a fixed point or in displacement (7); type of displacement relative to the subject to be filmed (8); movement speed; directions or axes of movement; direction of movement (10); shooting altitudes (9). 2. Generation of the position commands (15) through which the drone will pass, from said set of parameters (12), the instantaneous position of the subject to be filmed (13) and its recent trajectory (14). 3. Activation of shooting by the video camera once the drone is launched on the positions sent by the base station carried by the subject. The device according to the invention is particularly intended for aerial photography.
Publication number: FR3031402A1
Application number: FR1500002
Filing date: 2015-01-02
Publication date: 2016-07-08
Inventor: Sylvain Montreuil
Applicant: Squadrone System
Main IPC classification:
Patent description:

[0001] The invention relates to control systems for rotary-wing UAVs such as quadcopters, hexacopters and the like. These drones are provided with multiple rotors driven by respective motors that can be controlled in a differentiated manner in order to pilot the drone in attitude and speed. These drones are piloted by the user by means of a remote control device, hereinafter referred to as the "base station", connected to the drone by a radio link. A typical example of such a system is the IRIS from 3D Robotics (USA), a quadcopter equipped with a series of sensors (accelerometers, three-axis gyrometers, barometer, GPS), a camera stabilization system, a camera capturing an image of the scene towards which the drone is directed, an RC remote control, and a smartphone or tablet application with "base station" software communicating with the drone over a radio link. More specifically, one of the navigation modes of the drone is called automatic mode. In this mode, the drone is directed by the user by means of the "base station", which may be a computer, a telephone equipped with "base station" software, a multimedia tablet of the iPad type (trademark of Apple Inc., USA), or any other connected device, by sending positions through which the drone must pass. These crossing points are GPS coordinates and an altitude, pre-defined before the flight of the drone. The user defines the passage points of the drone through the "base station" either manually, in a programmed manner before the flight of the drone, or during the flight to influence the behavior of the drone. Another navigation mode is called manual or assisted mode. In this mode, the drone is piloted by the user by means of signals emitted by a remote control, which are translated by the drone into movements. For example, to move the drone forward, the user pushes the pitch stick forward.
In this way, when the drone is controlled so as to tilt or "dive" downwards (inclination along the pitch axis), it will move forward with a speed that increases with the inclination. Conversely, if it is controlled so as to "pitch up" in the opposite direction, its speed will gradually slow down, then reverse, the drone moving backwards. Similarly, for an inclination command along the roll axis, the drone will lean to the right or to the left, causing a linear displacement in horizontal translation to the right or to the left. The navigation techniques mentioned above are unfortunately not applicable, or poorly suited, to capturing video images of a fixed or moving subject with the onboard camera, for the following reasons: 1. Insofar as the drone is controlled manually, taking a point of view relative to a subject is a delicate operation requiring great dexterity and considerable drone-piloting experience, and is therefore not accessible to the neophyte or amateur public; 2. The use of the automatic navigation mode requires a flight plan to be prepared in advance, and therefore does not allow reacting to a behavior of the moving subject that was not defined beforehand. The problem addressed by the invention is to find another drone control technique that makes it possible to produce a video image sequence of a fixed or mobile subject without manual intervention or a pre-programmed flight plan. The basic idea of the invention is to use a "base station" carried by the subject to be filmed, which can communicate autonomously with the drone to perform a video image sequence. For this, the user chooses a "camera movement" within a library in the "base station" application, which automatically defines for the drone a corresponding trajectory, according to the position of the subject, suited to achieving the chosen shot. The subject can then move, the drone control system automatically adapting to the new positions of the subject.
More specifically, the subject of the invention is a method of autonomously controlling, by means of a "base station", a multiple-rotor rotary-wing drone so as to steer the drone in attitude and speed following a chosen camera movement and the position of the subject to be filmed. The drone comprises an onboard video camera capable of capturing a sequence of images of a target viewed from the drone and transmitting this sequence to the "base station". Typically, this method comprises the following steps: 1. Selection by the user of a camera movement defined by a set of parameters including: shooting mode, at a fixed point or in displacement; type of displacement relative to the subject to be filmed; movement speed; directions or axes of movement; direction of movement; shooting altitudes. 2. Generation of the position commands through which the drone will pass, from said set of parameters, the instantaneous position of the subject to be filmed and its recent trajectory. 3. Activation of shooting by the video camera once the drone is launched on the positions sent by the base station carried by the subject. An embodiment of the device of the invention will now be described with reference to the appended drawings, in which the same reference numerals designate identical or functionally similar elements from one figure to another. Figure 1 is an overview of the system showing the drone, a remote control and the "base station" for its control. Figure 2 is an example of a "base station" application on a smartphone used for automatic missions by preprogramming the GPS coordinates through which the drone is to pass. Figures 3 and 4 are examples illustrating a way of navigating within a movement library and defining the corresponding set of parameters. Figure 5 describes the implementation of the invention, by which the automatic control of the drone proceeds from a selected camera movement and its related parameters, and from the position of the subject to be filmed.
Figure 6 depicts an example of camera motion composition.
[0002] In Figure 1, the reference (1) generally designates a drone, which is for example a hexacopter. This drone comprises six coplanar rotors (4) whose motors are controlled independently by an integrated navigation and attitude control system. The drone (1) also comprises a front camera (11) for obtaining an image of the scene towards which the drone is directed.
[0003] The drone can be controlled by a remote control (2) or by a "base station" (3). In Figure 1, the example base station is a smartphone (3) equipped with a suitable application. This base station (3) can also be a tablet, a multimedia player or any other connected device provided with means of radio link with the drone for the bidirectional exchange of data: from the drone (1) to the base station (3), in particular for the position of the drone and the transmission of the image captured by the camera (11), and from the base station (3) to the drone (1) for sending piloting commands. Figure 2 shows an example of a mission preprogrammed on the base station (3) in automatic mode by GPS coordinates (5) through which the drone will pass. In currently available systems, an altitude of passage is associated with these coordinates.
[0004] Figure 3 shows an example of the base station application (3) on a smartphone allowing the user to navigate within a library of camera movements (6). The user can choose the type of viewpoint and camera movement he wants. Figure 4 shows an example of the details available on the base station (3), on a smartphone, for a given camera movement (6), allowing the user to specify a number of parameters related to the desired movement. The camera movement exemplified here is a 360° selfie. In this example these parameters are: the relative position of the camera with respect to the subject (7) in a horizontal plane; a relative distance (8) of the camera from the subject, allowing the user to choose the desired type of shot; the desired altitude evolution (9) of the camera, giving an aerial point of view of the subject to be filmed; and the speed and direction of rotation (10) of the camera around the subject. Figure 5 describes in detail the steps of implementation of the invention. From the camera movement and the set of parameters (12) chosen by the user on the base station, the position of the subject is evaluated at regular intervals (13) from sensors carried by the latter. These sensors can be linked to the base station, to the multimedia device hosting the application, or to a device dedicated to locating the subject. The location of the subject (13) is obtained through the hybridization of data from various sensors, which may include for example: a GPS sensor, an inertial unit containing three accelerometers, three gyroscopes, three magnetometers, and a barometer. The subject is located either in a two-dimensional (horizontal) plane or in three dimensions if the sensors allow it. From the location data of the subject (13) thus collected at regular intervals, a subject trajectory prediction mechanism (14) is implemented to evaluate the estimated trajectory of the subject to be filmed.
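The sensor hybridization described above can be sketched in a few lines. The following is a minimal illustrative example, not the patented implementation: the horizontal position is taken from GPS, while the fused altitude blends the drift-prone but smooth barometric altitude with the noisier absolute GPS altitude. All names and the blending weight are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class SubjectFix:
    lat: float   # degrees (from GPS)
    lon: float   # degrees (from GPS)
    alt: float   # metres, fused GPS + barometric estimate

def fuse_fix(gps_lat: float, gps_lon: float,
             gps_alt: float, baro_alt: float,
             baro_weight: float = 0.8) -> SubjectFix:
    """Hybridize altitude sources: trust the barometer more for
    short-term smoothness, the GPS for the absolute reference."""
    alt = baro_weight * baro_alt + (1.0 - baro_weight) * gps_alt
    return SubjectFix(gps_lat, gps_lon, alt)

# Example: GPS reports 212 m, barometer 210 m.
fix = fuse_fix(45.18, 5.72, gps_alt=212.0, baro_alt=210.0)
```

In practice a full inertial unit would also feed the estimate (e.g. through a Kalman filter); the weighted average above only conveys the principle of combining complementary sensors.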
The elements taken into account for the trajectory prediction may depend on the type of camera movement and the selected set of parameters (12), and may be more or less complex depending on the types of sensors available and the desired accuracy. A simple trajectory prediction can be realized by using a velocity vector (a heading/speed pair) of the subject, applied to its current position to estimate its future position and its displacement over time. The goal is to anticipate the position of the target according to its speed. The speed of the target is measured from the distance it travels during a given period of time. This measurement must be carried out at a frequency high enough for the drone to react quickly, but low enough for the measurement to remain sufficiently precise. If we consider that the GPS position is accurate to the nearest meter and that we want to move at a speed ranging from zero to 10 m/s, a good compromise is to make a speed measurement every second. The predicted position of the subject then makes it possible to define the target position of the drone for the next instant, as well as the parameters of its displacement (15). These parameters are, for example, the horizontal displacement speed, the vertical displacement speed and the rotation speed of the drone. The target position is typically a GPS point, an altitude and a desired orientation for the drone at a given time. The orientation of the camera on the pitch axis is also part of the target position, so that the camera axis always points towards the subject to be filmed and keeps it framed. This calculation of the position and displacement elements of the drone also depends on the camera movement and the selected set of parameters (12), as well as on the position and attitudes of the drone (17).
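The velocity-vector prediction just described can be sketched as follows. Two fixes taken about one second apart (the 1 Hz compromise between reactivity and GPS accuracy mentioned above) give a velocity that is extrapolated forward; a flat, metre-based local frame is assumed for simplicity, and the names are illustrative.

```python
Point = tuple[float, float]  # (east, north) in metres, local frame

def predict_position(prev_xy: Point, curr_xy: Point,
                     dt: float, horizon: float) -> Point:
    """Estimate the subject's position `horizon` seconds ahead from
    two successive fixes `dt` seconds apart."""
    vx = (curr_xy[0] - prev_xy[0]) / dt   # east velocity, m/s
    vy = (curr_xy[1] - prev_xy[1]) / dt   # north velocity, m/s
    return (curr_xy[0] + vx * horizon, curr_xy[1] + vy * horizon)

# Subject moving east at 5 m/s: expected 10 m further east after 2 s.
future = predict_position((0.0, 0.0), (5.0, 0.0), dt=1.0, horizon=2.0)
```

A constant-velocity extrapolation like this is the simplest of the prediction mechanisms the text allows for; richer models could weight several past fixes or use inertial data.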
[0005] This target position and the displacement elements are transmitted in the form of instructions to the drone (16) through a communication protocol and a radio link allowing communication between the drone and the base station. This radio link can be, for example, a Wi-Fi (IEEE 802.11) or Bluetooth (registered trademarks) wireless network. The communication protocol for transmitting these instructions from the base station (3) to the drone (1) can be, for example, a standardized protocol such as the MAVLink protocol developed by ETH Zurich, Switzerland. These position and displacement instructions are then taken into account by the flight controller of the drone, which actuates the controllers of the coplanar rotors (4) so as to move the drone towards the target position. A servo loop (17) is then executed to correct the real trajectory of the drone towards the target trajectory.
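One common way to realize the servo loop (17) is a proportional correction: the error between the target position and the drone's actual position is turned into a velocity command, clamped to the movement speed chosen in the parameter set. The sketch below is an illustrative assumption, not the MAVLink message layout or the patented controller; all names are invented for the example.

```python
import math

def velocity_command(actual_xy: tuple[float, float],
                     target_xy: tuple[float, float],
                     gain: float = 0.5,
                     max_speed: float = 10.0) -> tuple[float, float]:
    """Proportional servo step: position error -> velocity command,
    clamped to the selected movement speed (here 10 m/s)."""
    ex = target_xy[0] - actual_xy[0]
    ey = target_xy[1] - actual_xy[1]
    vx, vy = gain * ex, gain * ey
    speed = math.hypot(vx, vy)
    if speed > max_speed:  # respect the user-selected speed parameter
        vx, vy = vx * max_speed / speed, vy * max_speed / speed
    return (vx, vy)

# Drone at the origin, target 8 m east and 6 m north.
cmd = velocity_command((0.0, 0.0), (8.0, 6.0))
```

Running this at each iteration, with the command re-sent over the radio link, continuously pulls the real trajectory back towards the target trajectory.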
[0006] This entire process is iterated at regular intervals. The frequency of this process can typically be 10 Hz, and varies depending on the types of sensors and the type of camera movement chosen (12). In Figure 6, a camera trajectory or movement is described by a set of "frame" tags (18). Each frame has a unique identifier (18), a position relative to the target in a 3D frame (x, y, z) (19) and a time corresponding to the moment at which the drone passes through this point (20).
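The frame-tag composition of Figure 6 can be sketched as a small data structure: each frame carries an identifier (18), a subject-relative position (19) and a passage time (20). Interpolating linearly between consecutive frames to obtain the drone's target at any instant is an illustrative choice of ours, not something the text prescribes.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    ident: int                         # unique identifier (18)
    pos: tuple[float, float, float]    # (x, y, z) relative to subject (19)
    t: float                           # passage time in seconds (20)

def target_at(frames: list[Frame], t: float) -> tuple:
    """Linearly interpolate the subject-relative target position at time t."""
    frames = sorted(frames, key=lambda f: f.t)
    if t <= frames[0].t:
        return frames[0].pos
    for a, b in zip(frames, frames[1:]):
        if a.t <= t <= b.t:
            k = (t - a.t) / (b.t - a.t)
            return tuple(pa + k * (pb - pa) for pa, pb in zip(a.pos, b.pos))
    return frames[-1].pos

# A quarter orbit at 3 m altitude: start 5 m east, end 5 m north, 4 s apart.
orbit = [Frame(1, (5.0, 0.0, 3.0), 0.0), Frame(2, (0.0, 5.0, 3.0), 4.0)]
mid = target_at(orbit, 2.0)
```

Adding the interpolated relative position to the subject's predicted position at each iteration yields the absolute target point sent to the drone.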
Claims:
Claims (5)
[0001]
CLAIMS 1. A method of controlling a drone (1) by means of a remote control device (3), this drone being equipped with a video camera (11), the apparatus (3), here called the base station, being a portable apparatus comprising: a touch screen; means capable of detecting at least one touch of a user's finger on the surface of the screen; a means of locating a subject to be filmed (onboard or off-board sensors); and wireless data transmission means able to issue commands to the drone; the method comprising the following steps: - Selection by the user of a camera movement (6) defined by a set of parameters (12) comprising: shooting mode, at a fixed point or in displacement; type of displacement relative to the subject to be filmed (7); movement speed; directions or axes of movement; direction of movement (10); shooting altitudes (9); - Generation of the position commands through which the drone will pass, from said set of parameters (12), the instantaneous position of the subject to be filmed and its recent trajectory (14); - Activation of shooting by the video camera once the drone is launched on the positions sent by the base station carried by the subject (16) (17); characterized in that: a. the drone moves automatically according to the trajectory of the subject (13) and the prediction of its trajectory (14), b. so as to perform a predefined camera movement associated with a set of parameters (12), c. the calculation of the position and displacement elements of the drone depends both on the camera movement to be performed, on the position of the subject to be filmed, and on the location and attitudes of the drone (15).
[0002]
2. The method of claim 1, wherein the user can choose, by means of a graphical interface, a camera movement that he wishes to carry out within a library (6), and defines a set of parameters relating to this camera movement: the relative position of the camera with respect to the subject (7) in a horizontal plane; a relative distance (8) of the camera from the subject, allowing the user to choose the desired type of shot; the desired altitude evolution (9) of the camera, giving an aerial point of view of the subject to be filmed; and the speed and direction of rotation (10) of the camera around the subject.
[0003]
3. The method of claim 1, wherein a camera movement and its associated set of parameters (12) are performed by the drone (1) by means of commands sent by the base station (3), which evaluates the location of the subject (13) from sensors carried by said subject, predicts its trajectory (14), and calculates the target position and displacement elements of the drone (15) from the location and predicted trajectory of said subject, so that the camera axis is directed towards the subject to be filmed and said subject remains framed.
[0004]
4. The method of claim 3 wherein the drone moves to a target point setpoint (16) according to the position and movement of the subject to be filmed and the camera movement to be performed (6).
[0005]
5. The method of claim 4, wherein the trajectory of the drone is corrected by means of a servo-control loop (17) as a function of the position and the target displacement elements of the drone calculated from the location and the predicted trajectory of said subject.
Similar technologies:
Publication number | Publication date | Patent title
FR3031402A1|2016-07-08|METHOD OF AUTOMATICALLY CONTROLLING A ROTARY-WING DRONE FOR OPERATING CAMERA MOVEMENTS BY AN EMBEDDED CAMERA
US10860040B2|2020-12-08|Systems and methods for UAV path planning and control
JP6735821B2|2020-08-05|System and method for planning and controlling UAV paths
EP2613214B1|2017-08-02|Method for controlling a rotary-wing drone to operate photography by an on-board camera with minimisation of interfering movements
US20210072745A1|2021-03-11|Systems and methods for uav flight control
JP6816156B2|2021-01-20|Systems and methods for adjusting UAV orbits
US11233943B2|2022-01-25|Multi-gimbal assembly
EP2497555B1|2013-08-28|Method for piloting a rotary-wing drone with multiple rotors with curved steering.
EP3316068B1|2019-03-06|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle
EP2613213A1|2013-07-10|Intuitive piloting method of a drone by means of a remote control device
WO2018098704A1|2018-06-07|Control method, apparatus, and system, unmanned aerial vehicle, and mobile platform
EP3276591A1|2018-01-31|Drone with an obstacle avoiding system
EP3273318B1|2021-07-14|Autonomous system for collecting moving images by a drone with target tracking and improved target positioning
EP3306428A1|2018-04-11|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle
EP3273317A1|2018-01-24|Autonomous system for taking moving images, comprising a drone and a ground station, and associated method
JP2019511044A|2019-04-18|Method and system for motion control of a flying device
WO2018187916A1|2018-10-18|Cradle head servo control method and control device
EP3281871A1|2018-02-14|Method for capturing a video with a fixed-wing drone, related computer program and electronic system
Lu et al.2017|iOS application for quadrotor remote control
EP3392728A1|2018-10-24|Method for piloting a rotary wing drone, related computer program, electronic apparatus and drone
FR2922666A1|2009-04-24|Modular exploiting and managing system controlling method for e.g. drone, involves selecting navigation data to be considered for managing platform according to mission data derived from ground station by automation block
WO2018227345A1|2018-12-20|Control method and unmanned aerial vehicle
EP3368957B1|2022-02-09|Systems and methods for uav path planning and control
US20200221056A1|2020-07-09|Systems and methods for processing and displaying image data based on attitude information
WO2020042062A1|2020-03-05|Drift control method and device for ground remote control robot, and ground remote control robot
Family patents:
Publication number | Publication date
FR3031402B1|2018-09-07|
US20160194079A1|2016-07-07|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
EP3273317A1|2016-07-22|2018-01-24|Parrot Drones|Autonomous system for taking moving images, comprising a drone and a ground station, and associated method|
WO2018098898A1|2016-11-29|2018-06-07|深圳市元征科技股份有限公司|Terrestrial magnetism-based unmanned aerial vehicle control method, and unmanned aerial vehicle|
US8718838B2|2007-12-14|2014-05-06|The Boeing Company|System and methods for autonomous tracking and surveillance|
US8639396B1|2008-10-08|2014-01-28|Raytheon Company|Cooperative control of unmanned aerial vehicles for tracking targets|
US9930298B2|2011-04-19|2018-03-27|JoeBen Bevirt|Tracking of dynamic object of interest and active stabilization of an autonomous airborne platform mounted camera|
US9367067B2|2013-03-15|2016-06-14|Ashley A Gilmore|Digital tethering for tracking with autonomous aerial robot|
US20150234055A1|2014-02-20|2015-08-20|Javad Gnss, Inc.|Aerial and close-range photogrammetry|
CN107577247B|2014-07-30|2021-06-25|深圳市大疆创新科技有限公司|Target tracking system and method|
US11266919B2|2012-06-29|2022-03-08|Monkeymedia, Inc.|Head-mounted display for navigating virtual and augmented reality|
JP6408832B2|2014-08-27|2018-10-17|ルネサスエレクトロニクス株式会社|Control system, relay device, and control method|
EP3101889A3|2015-06-02|2017-03-08|LG Electronics Inc.|Mobile terminal and controlling method thereof|
TWI557526B|2015-12-18|2016-11-11|林其禹|Selfie-drone system and performing method thereof|
US20170214856A1|2016-01-22|2017-07-27|Mediatek Inc.|Method for controlling motions and actions of an apparatus including an image capture device having a moving device connected thereto using a controlling device|
US20170272491A1|2016-03-16|2017-09-21|Mesa Digital, LLC.|Self-contained and portable synchronized data communication system and method for facilitating the wireless transmission of video and data from venues to client devices|
US10382539B1|2016-06-01|2019-08-13|Cape Productions Inc.|Methods and apparatus for data control and transfer with an unmanned aerial vehicle|
US10464669B2|2016-06-24|2019-11-05|Cisco Technology, Inc.|Unmanned aerial vehicle collision avoidance system|
US10417755B1|2016-11-18|2019-09-17|Talon Aerolytics , Inc.|Drone-based inspection of wireless communication towers and corresponding methods, systems, and apparatuses|
KR20180075191A|2016-12-26|2018-07-04|삼성전자주식회사|Method and electronic device for controlling unmanned aerial vehicle|
US10279825B2|2017-01-10|2019-05-07|General Electric Company|Transfer of vehicle control system and method|
US10168704B2|2017-06-05|2019-01-01|Hanzhou Zero Zero Technology Co., Ltd.|System and method for providing easy-to-use release and auto-positioning for drone applications|
WO2019029551A1|2017-08-10|2019-02-14|Hangzhou Zero Zero Technology Co., Ltd.|System and method for obstacle avoidance in aerial systems|
JP2019053425A|2017-09-13|2019-04-04|富士ゼロックス株式会社|Information processing device and program|
TWI657011B|2017-11-30|2019-04-21|財團法人工業技術研究院|Unmanned aerial vehicle, control system for unmanned aerial vehicle and control method thereof|
CN108325216B|2017-12-29|2020-08-14|黄巧香|Position capturing method and device of remote control toy, user terminal and system|
CN108928480A|2018-07-05|2018-12-04|武汉捷特航空科技有限公司|A kind of unmanned plane being automatically separated storehouse with photographic equipment waterproof|
DE102018120010A1|2018-08-16|2020-02-20|Autel Robotics Europe Gmbh|ROUTE DISPLAY METHOD, DEVICE AND SYSTEM, GROUND STATION AND COMPUTER READABLE STORAGE MEDIUM|
Legal status:
2016-02-01| PLFP| Fee payment|Year of fee payment: 2 |
2016-07-08| PLSC| Publication of the preliminary search report|Effective date: 20160708 |
2017-01-12| PLFP| Fee payment|Year of fee payment: 3 |
2018-01-30| PLFP| Fee payment|Year of fee payment: 4 |
2020-01-27| PLFP| Fee payment|Year of fee payment: 6 |
2021-01-13| PLFP| Fee payment|Year of fee payment: 7 |
2022-01-28| PLFP| Fee payment|Year of fee payment: 8 |
Priority:
Application number | Filing date | Patent title
FR1500002A|FR3031402B1|2015-01-02|2015-01-02|METHOD OF AUTOMATICALLY CONTROLLING A ROTARY-WING DRONE FOR OPERATING CAMERA MOVEMENTS BY AN EMBEDDED CAMERA|
FR1500002|2015-01-02|FR1500002A| FR3031402B1|2015-01-02|2015-01-02|METHOD OF AUTOMATICALLY CONTROLLING A ROTARY-WING DRONE FOR OPERATING CAMERA MOVEMENTS BY AN EMBEDDED CAMERA|
US14/985,979| US20160194079A1|2015-01-02|2015-12-31|Method of automatically piloting a rotary-wing drone for performing camera movements with an onboard camera|