Patent abstract:
The system includes a drone and a ground station with a console adapted to be directed towards the drone, and virtual reality glasses rendering images taken by a camera of the drone. The system further comprises means for modifying the framing of the images taken by the camera according to framing instructions received from the ground station. It further comprises relative heading determination means (302-324) for periodically developing an angular difference between the orientation of the glasses and the orientation of the console, and means (316) for developing framing instructions for the drone according to said angular difference. This avoids unexpected changes of framing when the pilot simply turns the console and his entire body, including his head, towards the drone to follow it in its movements.
Publication number: FR3034681A1
Application number: FR1553091
Filing date: 2015-04-10
Publication date: 2016-10-14
Inventors: Henri Seydoux; Nicolas Brulez; Arsene Ferrand
Applicant: Parrot SA
Patent description:

[0001] The invention relates to rotary-wing drones such as quadricopters and the like. These drones are provided with multiple rotors driven by respective motors that can be controlled in a differentiated manner in order to pilot the drone in attitude and speed. The AR.Drone 2.0 or the Bebop Drone of Parrot SA, Paris, France are typical examples of such quadricopters. They are equipped with a series of sensors (accelerometers, three-axis gyrometers, altimeters), a front camera capturing an image of the scene towards which the drone is pointing, and a vertical-aiming camera capturing an image of the terrain overflown. WO 2010/061099 A2 and EP 2 364 757 A1 (Parrot SA) describe such a drone and its piloting by means of a phone or multimedia player with touch screen and built-in accelerometer, for example a cellular phone of the iPhone type or a multimedia tablet of the iPad type (trademarks of Apple Inc., USA). These devices incorporate the various control members necessary for the detection of piloting commands and the bidirectional exchange of data with the drone via a Wi-Fi (IEEE 802.11) or Bluetooth (registered trademarks) wireless network link. They are also provided with a touch screen displaying the image captured by the front camera of the drone, with a number of superimposed symbols enabling commands to be activated by simply touching this touch screen with the user's finger. The front camera of the drone can be used for piloting in "immersive mode" or FPV (First-Person View), that is to say in a mode where the user uses the image of the camera in the same way as if he were himself on board the drone. It can also serve to capture sequences of images of a scene towards which the drone is moving, so that the user can use the drone in the same way as a camera or a camcorder that, instead of being held by hand, would be carried by the drone. The images collected can be recorded, broadcast, put online on video-hosting web sites, sent to other Internet users, shared on social networks, etc. In the Bebop Drone system marketed by the applicant, the pilot directly views the image taken by the drone on the ground station marketed under the name Skycontroller, and he can control the viewing angle of the camera using the touch screen of the iPad-type tablet of the ground station. This viewing-angle control acts on a digital reframing and distortion-correction processing applied to the image from the fisheye-type camera, so as to simulate a physical movement of the imaging axis of the camera to the left, to the right, upwards and downwards. In addition, the ground station has a video output for connecting FPV goggles for immersive-mode piloting.
[0002] This immersive mode however reaches its limits insofar as the displayed image depends only on the user's touch commands, regardless of the posture of the user and in particular of his head. The present invention aims, in a system comprising a drone, a pilot ground station and FPV-type glasses, to pilot the drone naturally while allowing the user, in a particularly simple and intuitive way that does not disturb piloting, to modify the framing of the image rendered in the glasses. For this purpose, the invention proposes an immersive drone piloting system comprising a drone equipped with shooting means and a ground station communicating with the drone via a wireless link. The ground station includes a console adapted to be directed towards the drone to maintain the quality of the wireless link, and virtual reality glasses rendering images taken using the shooting means and transmitted from the drone through the wireless link. The system further comprises means for modifying the framing of the images taken by the shooting means according to framing instructions received from the ground station. The system is characterized in that it further comprises: - relative heading determination means for periodically developing an angular difference between the orientation of the glasses and the orientation of the console, and - means for developing framing instructions for the drone according to said angular difference. This avoids unexpected changes of framing when the pilot simply turns the console and his entire body, including his head, in the direction of the drone to follow it in its movements; a reframing is caused only when the pilot's head turns in yaw (laterally), and preferably also in pitch (up and down), relative to the console and therefore relative to the whole of his body, against which the console is held and/or attached.
The system also optionally includes the following advantageous features, taken individually or in any combination which those skilled in the art will recognize as technically compatible:
- the relative heading determination means comprise an inertial unit associated with the ground station and processing means for determining a heading of the console relative to a reference;
- the inertial unit associated with the console comprises a magnetometer, and the reference is constituted by magnetic north;
- the relative heading determination means comprise an inertial unit associated with the glasses and processing means for determining a heading of the glasses relative to a reference;
- the inertial unit associated with the glasses comprises a magnetometer, and the reference is constituted by magnetic north;
- the heading determination means comprise means for estimating the heading of the glasses by integrating speed and/or acceleration data generated in the glasses, initialization means for determining an offset between the console heading and the estimated heading of the glasses when a physical heading correspondence between the console and the glasses is triggered by an operator, and means for determining a corrected heading of the glasses from the estimated heading and the offset;
- the relative heading determination means comprise subtraction means between the angular values of the console heading and of the glasses heading;
- the orientation of the console and the orientation of the glasses include at least one yaw and/or pitch orientation;
- the means for modifying the framing comprise digital processing means for reframing and distortion-correcting a fixed fisheye-type video image taken by the shooting means;
- the system further comprises means for superimposing, on the image rendered in the virtual reality glasses, a heading indicator of the drone.
[0003] An embodiment of an image transmission system between a drone and a drone ground station according to the invention will now be described with reference to the appended drawings, in which the same references denote, from one figure to another, identical or functionally similar elements. Figure 1 schematically illustrates a drone and a commercial ground station, to which the invention advantageously applies. Figures 2a and 2b illustrate the assembly of Figure 1 supplemented with immersive-mode goggles according to the invention, for two different angular positions of the user's head. Figure 3 is a logic diagram of the essential functions performed by the system of the invention.
[0004] We will now describe an embodiment of the device of the invention. A first-person-view system according to the present invention comprises a drone equipped with at least one camera, a ground station communicating with the drone via a wireless link of appropriate range, and virtual reality glasses equipped with means for rendering, before the eyes of the user, images giving him the sensation of flying on board the drone in the most realistic way possible (First-Person View or FPV flight). In this example, the drone comprises a single camera with fisheye optics, but could alternatively comprise a set of cameras, with digital processing means on board and/or provided in the ground station for combining the images of the various cameras. In Figure 1, reference numeral 10 generally designates the drone, for example a quadricopter-type flying drone such as the Bebop model of Parrot SA, Paris, France. This drone comprises four coplanar rotors 12 whose motors are controlled independently by an integrated navigation and attitude-control system. The drone also includes a front-facing camera 14 which makes it possible to obtain a video image of the scene towards which the drone is heading, for example a camera with a high-definition CMOS sensor of 1920 x 1080 pixel resolution and a video-stream refresh rate of 30 fps (frames per second), combined with a fisheye wide-angle lens. The drone also has image processing and correction means capable of reframing, in real time or near-real time, the images taken by the drone as a function of the direction in which the pilot wishes to direct his gaze, as will be seen below. The drone is piloted by a ground station 16 comprising a remote-control console 20 associated with a tablet 18.
[0005] The tablet 18 is a conventional tablet, for example of the iPad (trademark) type or the like, comprising a display screen 22, and into which a software application module for piloting the drone 10 and for viewing the images taken by the camera has been loaded. The tablet 18 is also equipped with a radio interface for data exchange, for example of the Wi-Fi type. It is a tablet of a standard model, unmodified except for the loading of the application software module. The screen 22 of the tablet 18 displays the image of the scene captured by the camera 14 on board the drone, with, superimposed, a number of symbols allowing the control of various navigation parameters. This screen is a touch screen allowing the activation of piloting commands by simply touching an operator's finger on the symbols displayed on the screen (up/down, attitude control, etc.). These actions are interpreted by the application software, which transforms them into command signals for the drone, sent over the Wi-Fi data exchange interface. Conversely, the flight data coming from the drone 10 and received by the Wi-Fi interface are delivered to the application software to be processed there and possibly displayed on the screen 22 of the tablet.
[0006] Issuing commands from the touch screen of the tablet is here only optional, the various commands being duplicated and completed by actions on various buttons and joysticks of the console 20. More specifically, the console 20 comprises a support 24 for securing the tablet 18 to the console, facing the operator so that he can see the entire surface of the screen 22. The body of the console 20 is extended on each side by handles 26 provided with various control members such as sticks 28 of the "joystick" type and buttons 30 arranged within reach of the operator's fingers, each associated with a particular command (take-off/landing, return to the starting point, triggering of the recording of the scene captured by the camera, etc.). The console 20 is also provided with various illuminated displays 32 (drone battery level, recording in progress, etc.), as well as a connector (not shown) allowing the connection of accessories, including virtual reality glasses with an integrated screen that can be used as an alternative to the display screen 22 of the tablet. Alternatively, the accessories can communicate with the console by wireless communication. The console 20 is also provided with a long-range directional Wi-Fi antenna 34, which the operator directs towards the drone 10 when he holds the remote-control console 20 in his hands. This Wi-Fi link notably conveys, towards the drone 10, the piloting commands and the angular-component data of the direction in which the immersed pilot wishes to watch the scene taken by the camera 14 (as will be described below), and, from the drone, the video signal coming from the camera, cropped here in the drone by the processing means mentioned above. Finally, the console 20 is equipped with an inertial unit 38, preferably comprising a 3-axis accelerometer, a 3-axis magnetometer and a 3-axis gyrometer. Referring to Figures 2a and 2b, there are shown the console 20 and virtual reality glasses 40 connected to the console by a wired or wireless link as described above, constituting a bidirectional communication channel.
[0007] For example, the glasses are of the Oculus Rift (trademark) type sold by the company Oculus, Menlo Park, California, USA, connected to the console 20 by a wired HDMI and/or USB link, or even glasses of the Zeiss VR One (registered trademark) type, the latter incorporating a mobile phone replacing the tablet 18, the link with the console 20 then being a wireless link. The console 20 is able to communicate with the glasses 40 in the uplink direction so that they display a video image. In the simplest embodiment, this image is a copy of the image displayed on the display screen 22 of the tablet. In this first embodiment, the glasses 40 are equipped with an inertial unit designated schematically by the reference 42, this inertial unit comprising, in a manner known per se, a 3-axis gyrometer, a 3-axis accelerometer and a 3-axis magnetometer.
[0008] According to the invention, the measurements delivered by the inertial unit 42 are transmitted to the console 20 via the downlink communication channel so as to control the framing of the image delivered by the camera 14 of the drone 10. The general principle of the invention consists, when the pilot turns his head, in using the signals supplied by the inertial unit 42 to determine the movement of the head and to derive from it an angle of rotation, preferably consisting of an angular pitch component (up, down) and an angular yaw component (to the left, to the right).
[0009] The console transmits the values of these two angular components to the drone 10 so that the latter performs the reframing and correction processing of the wide-angle image taken by its camera 14, adapting the rendered image to the direction in which the pilot's head is turned.
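To make the mechanism concrete, here is a minimal Python/NumPy sketch of this kind of software reframing, assuming an equidistant fisheye model (r = f·θ) with a 180° image circle; the function name, parameters and axis conventions are illustrative assumptions, not Parrot's actual processing.

```python
import numpy as np

def reframe_fisheye(img, yaw_deg, pitch_deg, out_size=(480, 640),
                    out_fov_deg=90.0, f_fish=None):
    """Extract a rectilinear crop from an equidistant fisheye image, simulating
    a physical pan/tilt of the camera (illustrative model, not Parrot's code)."""
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    # Equidistant model r = f * theta; default f assumes a 180 deg image circle.
    f_fish = f_fish or min(cx, cy) / (np.pi / 2.0)

    oh, ow = out_size
    f_out = (ow / 2.0) / np.tan(np.radians(out_fov_deg) / 2.0)

    # One viewing ray per output pixel of the virtual (cropped) camera.
    u, v = np.meshgrid(np.arange(ow) - ow / 2.0, np.arange(oh) - oh / 2.0)
    rays = np.stack([u, v, np.full_like(u, f_out)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by the yaw/pitch framing instruction (y axis, then x axis).
    yw, pt = np.radians(yaw_deg), np.radians(pitch_deg)
    ry = np.array([[np.cos(yw), 0.0, np.sin(yw)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(yw), 0.0, np.cos(yw)]])
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(pt), -np.sin(pt)],
                   [0.0, np.sin(pt), np.cos(pt)]])
    rays = rays @ (ry @ rx).T

    # Project the rotated rays back into fisheye pixel coordinates.
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))  # angle off optical axis
    phi = np.arctan2(rays[..., 1], rays[..., 0])          # azimuth in image plane
    r = f_fish * theta
    src_x = np.clip(cx + r * np.cos(phi), 0, w - 1).astype(int)
    src_y = np.clip(cy + r * np.sin(phi), 0, h - 1).astype(int)
    return img[src_y, src_x]  # nearest-neighbour resampling, for brevity
```

With yaw_deg = pitch_deg = 0 the crop is centered on the optical axis, which corresponds to the "looking straight ahead" case of Figure 2a discussed below.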
[0010] Stated as such, this feature however raises a difficulty. Indeed, the system of drone 10 and ground station 16 described above requires that the antenna 34 of the console be properly oriented, at least approximately, in the direction of the drone. This obliges the user to rotate his entire body, which supports the ground station 16, and therefore his head, in order to follow the position of the drone when it moves.
[0011] It must be avoided that such a rotation be interpreted by the system as a rotation of the head intended to look left or right, and/or up or down. If the framing of the image thus followed the rotation of the pilot's entire body together with the console, the pilot would rapidly lose his bearings and the drone would become impossible to fly in the desired direction. According to the invention, in order to modify the framing of the image taken by the camera 14 of the drone, the system determines a relative angular position, in pitch and yaw, between the virtual reality glasses 40 and the console 20, and controls the framing of the image from these data. The various operations implemented for this purpose will now be described in detail with reference to Figure 3. First, at step 302, the system determines whether the glasses 40 connected to the ground station are equipped with autonomous means for determining the absolute orientation of the glasses (relative to magnetic north), namely an inertial unit equipped with a magnetometer and an accelerometer, and possibly a gyrometer. If so, at step 304, the data from the magnetometer are reprojected using the gravity information obtained from the accelerometer data, which makes it possible to determine, at step 306, the heading of the glasses, both in pitch and in yaw. These data are transmitted to the ground station 16 via the downlink communication channel. In the ground station, the data from the inertial unit 38 incorporated in the console 20 are processed at step 308 to generate, at step 310, a heading value of the console relative to magnetic north. The glasses heading value, retrieved from the downlink transmission channel at step 312, is subtracted from the heading value of the console at step 314.
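By way of illustration, the reprojection of steps 304/308 and the subtraction of step 314 can be sketched as follows in Python. The sketch assumes the accelerometer reading is aligned with gravity and that the device's forward axis is its x axis (actual sensor sign conventions vary), and the function names are invented for the example.

```python
import numpy as np

def tilt_compensated_heading(mag, acc, fwd=(1.0, 0.0, 0.0)):
    """Heading in degrees relative to magnetic north (steps 304-310, sketch).
    mag, acc: 3-axis magnetometer / accelerometer readings in the body frame;
    acc is assumed here to point along gravity, fwd is the forward axis."""
    down = np.asarray(acc, float)
    down /= np.linalg.norm(down)
    # Reproject the magnetic field and the forward axis onto the horizontal
    # plane defined by the gravity direction (the "reprojection" of step 304).
    mag_h = np.asarray(mag, float) - np.dot(mag, down) * down
    fwd_h = np.asarray(fwd, float) - np.dot(fwd, down) * down
    # Signed angle, about the vertical, between forward axis and magnetic north.
    sin_h = np.dot(np.cross(fwd_h, mag_h), down)
    cos_h = np.dot(fwd_h, mag_h)
    return np.degrees(np.arctan2(sin_h, cos_h))

def relative_heading(glasses_heading, console_heading):
    """Step 314: glasses heading minus console heading, wrapped to (-180, 180]."""
    return (glasses_heading - console_heading + 180.0) % 360.0 - 180.0
```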
[0012] In the case where this difference is zero, to within an error tolerance of, for example, a few degrees (case of Figure 2a), this means that the pilot is looking straight ahead, and the framing of the fisheye image taken by the camera is performed on the optical axis of the camera, that is to say centrally. This is illustrated in the upper part of Figure 2a by the display, superimposed on the image, of a heading indicator CD of the drone, for example in the form of a circle, corresponding to the optical axis of the camera and to the main axis of displacement of the drone 10, and therefore to the optical center of the image. This indicator is here centered in the displayed image (both on the display 22 of the tablet 18 and on the display 44 of the glasses). In the case where the pilot has turned his head to the left without turning his body (the heading of the console not changing), as shown in Figure 2b, step 314 determines a corresponding difference between the heading of the glasses and the heading of the console. This difference is materialized, in this case where the headings are determined both in pitch and in yaw, by an angular difference component in pitch and an angular difference component in yaw, and these components are sent to the drone 10 at step 316 so that the image processing in the latter performs a corresponding cropping of the fisheye image taken by the camera 14, this image being transmitted back to the ground station 16, which itself transmits it to the virtual reality glasses 40. This is illustrated in the upper part of Figure 2b by the fact that the cropped image displayed on the display device 22 and in the glasses 40 is located to the left of the heading indicator CD of the drone. The differential heading measurement described in the preceding steps is carried out at regular intervals, for example 10 to 100 times per second, so as to adjust each time the framing of the image taken by the camera 14 and to render the cropped image on the ground station 16 and in the glasses 40.
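These periodic measurements and the few-degree tolerance amount to a simple polling loop. The following Python sketch assumes hypothetical callbacks for reading the two headings (in degrees) and for sending the framing instruction; the dead-band and rate values are picked from the ranges quoted in the description.

```python
import time

DEAD_BAND_DEG = 3.0  # "a few degrees" tolerance mentioned in the description
RATE_HZ = 30         # within the 10-100 updates/s range given in the text

def wrap_deg(a):
    """Wrap an angular difference to (-180, 180] degrees."""
    return (a + 180.0) % 360.0 - 180.0

def framing_loop(read_glasses, read_console, send_framing):
    """Periodic differential-heading loop (the three callbacks are hypothetical):
    each tick turns the glasses-vs-console offset into the yaw/pitch framing
    instruction sent to the drone (steps 312-316). Runs until interrupted."""
    while True:
        g_yaw, g_pitch = read_glasses()   # glasses heading, deg (steps 306/312)
        c_yaw, c_pitch = read_console()   # console heading, deg (step 310)
        d_yaw = wrap_deg(g_yaw - c_yaw)   # step 314
        d_pitch = wrap_deg(g_pitch - c_pitch)
        if abs(d_yaw) < DEAD_BAND_DEG and abs(d_pitch) < DEAD_BAND_DEG:
            d_yaw = d_pitch = 0.0         # pilot looks straight ahead: center
        send_framing(d_yaw, d_pitch)      # step 316: the crop follows the head
        time.sleep(1.0 / RATE_HZ)
```

When the whole body turns together with the console, both headings change by the same amount, the difference stays inside the dead band, and no reframing is sent, which is exactly the behaviour described in the next paragraph.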
[0013] By contrast, when the pilot's entire body and head, together with the ground station 16, rotate as one block (in particular to adjust the orientation of the antenna 34 towards the drone 10), no cropping of the image is caused. An alternative embodiment of the invention will now be described, useful in the case where the virtual reality glasses are not equipped with a magnetometer. This implementation can be realized, for example, by providing in the software embedded in the tablet 18 a glasses-type recognition module from which the system is able to determine what type of position and/or movement and/or acceleration detection means equips the glasses connected to the station 16 (the glasses themselves having, in a manner conventional per se, an unchanging equipment-type identifier). Thus, in the present case, contrary to the process described above, step 302 determines that the connected glasses, this time of a different model, are equipped with a gyrometer and an accelerometer, but not with a magnetometer (step 318). It is understood that in this case the system is unable to determine an absolute heading for the glasses.
[0014] In this case, the system regularly performs (step 320) an estimation of the heading of the glasses by carrying out a temporal integration of the information from the accelerometer and the gyrometer. At step 322, an initialization is performed, in which the pilot is asked to align the heading of the goggles (determined by the direction of the head) with the heading of the console 20 and then to press a button of the console 20 (or a specific touch zone of the tablet 18). At this moment, the difference between the estimated heading of the glasses 40 and the heading of the console 20 measured at that instant is determined and stored to serve as an offset value (step 324).
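A compact Python sketch of this magnetometer-less branch is given below; the class and method names are invented for the illustration. The heading is obtained by integrating the yaw rate (step 320) and therefore drifts over time; the one-shot calibration of steps 322-324 compensates for this by storing an offset with respect to the console heading.

```python
class GyroHeadingEstimator:
    """Heading estimate for magnetometer-less glasses (steps 320-324, sketch).
    A real implementation would also fuse accelerometer data; here only the
    gyrometer yaw rate is integrated, for brevity."""

    def __init__(self):
        self.estimated = 0.0  # drifting, integration-only heading (step 320)
        self.offset = 0.0     # console-vs-glasses offset captured at init (step 324)

    def integrate(self, yaw_rate_dps, dt):
        # Step 320: temporal integration of the angular rate, deg/s * s.
        self.estimated += yaw_rate_dps * dt

    def calibrate(self, console_heading):
        # Steps 322-324: the pilot looks where the console points and presses
        # a button; record the discrepancy between the two headings.
        self.offset = console_heading - self.estimated

    def corrected_heading(self):
        # Step 306 in this branch: estimated heading plus the stored offset.
        return self.estimated + self.offset
```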
[0015] Subsequently, the current heading value is calculated at step 306 by applying to the estimated heading of the glasses, still obtained at step 320, the offset determined at step 324. Thereafter the process proceeds as described previously. Of course, the present invention is in no way limited to the embodiments described and shown, and those skilled in the art will be able to provide many variations and modifications. In particular:
- it applies to drones of various types, for inspection, leisure or other purposes, hovering or not;
- it applies to all types of virtual reality glasses incorporating an inertial unit, or at least an accelerometer and a gyrometer (2-axis or 3-axis), or to which is fixed a separate module having an inertial unit, or at least an accelerometer and a gyrometer (2-axis or 3-axis);
- the reframing of the image as a function of the position of the head relative to the console is effected at least according to the yaw angle, and preferably according to the yaw angle and the pitch angle, and if appropriate also taking into account the roll angle;
- the reframing of the image may be of the software type, from a fisheye image taken by the video camera 14 (with curvature-correction processing known per se), or of the hardware type, the camera then being mounted for example on a motorized gimbal, the optical viewing angle of the camera being adjusted according to the pitch and yaw angular components received from the ground station at step 316;
- the drone can be equipped not with a single camera but with a set of cameras, of identical or different definitions, having different optical viewing angles, whose images are combined to generate the image displayed in the glasses 40 and, if appropriate, on the display device 22;
- the cropping and image-correction processing according to the angles of the pilot's head relative to the console can be carried out in the drone, in the ground station, or in a manner distributed between the two; if the virtual reality glasses have digital processing means with sufficient capacity, part of this processing can also be supported by the glasses;
- the virtual reality glasses may include a dedicated display system, or a commercial smart terminal such as a smartphone or tablet, mounted on the glasses.
Claims:
Claims (10)
1. Immersive drone piloting system comprising a drone (10) with shooting means (14) and a ground station (16, 40) communicating with the drone via a wireless link, the ground station comprising a console (20) adapted to be directed towards the drone to maintain the quality of the wireless link and virtual reality glasses (40) rendering images taken using the shooting means and transmitted from the drone by the wireless link, the system further comprising means for modifying the framing of the images taken by the shooting means (14) according to framing instructions received from the ground station, the system being characterized in that it further comprises: - relative heading determination means (302-324) for periodically developing an angular difference between the orientation of the glasses and the orientation of the console, and - means (316) for developing framing instructions for the drone according to said angular difference.
2. System according to claim 1, wherein the relative heading determination means comprise an inertial unit (38) associated with the ground station and processing means for determining a heading of the console relative to a reference.
3. System according to claim 2, wherein the inertial unit (38) associated with the console comprises a magnetometer, and the reference is constituted by magnetic north.
4. System according to claim 2, wherein the relative heading determination means comprise an inertial unit (42) associated with the glasses and processing means for determining a heading of the glasses relative to a reference.
5. The system of claim 4, wherein the inertial unit (42) associated with the glasses comprises a magnetometer and the reference is constituted by magnetic north.
6. The system of claim 2, wherein the heading determination means comprise means (320) for estimating the heading of the glasses by integrating speed and/or acceleration data generated in the glasses, initialization means (322) for determining an offset between the console heading and the estimated heading of the glasses when a physical heading correspondence between the console and the glasses is triggered by an operator, and means (324) for determining a corrected heading of the glasses from the estimated heading and the offset.
7. The system of claim 2, wherein the relative heading determination means comprise subtraction means (314) between the angular values of the console heading and of the glasses heading.
8. The system of claim 1, wherein the orientation of the console and the orientation of the glasses comprise at least one yaw and/or pitch orientation.
9. The system of claim 1, wherein the framing modifying means comprise digital processing means for reframing and correcting a fixed fisheye-type video image taken by the shooting means (14).
10. System according to claim 1, further comprising means for superimposing, on the image rendered in the virtual reality glasses, a heading indicator (CD) of the drone.
Similar technologies:
Publication number | Publication date | Patent title
EP3078402B1|2017-10-04|System for piloting an fpv drone
EP3086195B1|2019-02-20|System for piloting a drone in first-person view mode
EP3025770B1|2017-01-25|Video system for piloting a drone in immersive mode
EP2933775B1|2016-12-28|Rotary-wing drone provided with a video camera supplying stabilised image sequences
EP3316068B1|2019-03-06|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle
EP3048789B1|2016-12-28|Drone provided with a video camera and means to compensate for the artefacts produced at the greatest roll angles
EP3142353B1|2019-12-18|Drone with forward-looking camera in which the control parameters, especially autoexposure, are made independent of the attitude
EP2613214B1|2017-08-02|Method for controlling a rotary-wing drone to operate photography by an on-board camera with minimisation of interfering movements
EP3069767B1|2017-09-13|Method for optimising the orientation of a remote-control device relative to a flying or wheeled drone
EP2613213A1|2013-07-10|Intuitive piloting method of a drone by means of a remote control device
EP3076258B1|2018-08-01|Drone provided with a video camera with compensated vertical focussing of instantaneous rotations for estimating horizontal speeds
EP2364757B1|2012-05-09|Method and device for remote control of a drone, in particular a rotary-wing drone
FR2972364A1|2012-09-14|METHOD FOR PILOTING A MULTI-ROTOR ROTARY-WING DRONE TO FOLLOW A CURVILINEAR TURN.
EP3273318B1|2021-07-14|Autonomous system for collecting moving images by a drone with target tracking and improved target positioning
EP3306428A1|2018-04-11|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle
EP3273317A1|2018-01-24|Autonomous system for taking moving images, comprising a drone and a ground station, and associated method
EP3392728A1|2018-10-24|Method for piloting a rotary wing drone, related computer program, electronic apparatus and drone
WO2021117606A1|2021-06-17|Image processing device, system, image processing method and image processing program
FR3020168A1|2015-10-23|ROTARY-WING DRONE WITH VIDEO CAMERA DELIVERING STABILIZED IMAGE SEQUENCES
FR3052678A1|2017-12-22|DRONE PROVIDED WITH A FRONT VIDEO CAMERA COMPENSATING FOR THE INSTANTANEOUS ROTATIONS OF THE DRONE AND CORRECTION OF THE ARTIFACTS
Patent family:
Publication number | Publication date
EP3078402A1|2016-10-12|
CN106054870A|2016-10-26|
US20160297522A1|2016-10-13|
JP2016199261A|2016-12-01|
FR3034681B1|2017-05-12|
EP3078402B1|2017-10-04|
US9738382B2|2017-08-22|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
JP2003267295A|2002-03-14|2003-09-25|Foundation For Nara Institute Of Science & Technology|Remote operation system|
WO2010061099A2|2008-11-27|2010-06-03|Parrot|Device for piloting a drone|
EP2364757A1|2010-03-11|2011-09-14|Parrot|Method and device for remote control of a drone, in particular a rotary-wing drone|
CN102043410A|2010-09-30|2011-05-04|Tsinghua University|Servo system for instructing pan-tilt system of unmanned aerial vehicle by adopting head movement of operator|
US8903568B1|2013-07-31|2014-12-02|SZ DJI Technology Co., Ltd|Remote control method and terminal|
US5240207A|1992-08-03|1993-08-31|The United States Of America As Represented By The Secretary Of The Navy|Generic drone control system|
US5610678A|1993-12-30|1997-03-11|Canon Kabushiki Kaisha|Camera including camera body and independent optical viewfinder|
US7312766B1|2000-09-22|2007-12-25|Canadian Space Agency|Method and system for time/motion compensation for head mounted displays|
US10627860B2|2011-05-10|2020-04-21|Kopin Corporation|Headset computer that uses motion and voice commands to control information display and remote devices|
EP2828148A4|2012-03-20|2015-12-09|Crane Cohasset Holdings Llc|Image monitoring and display from unmanned vehicle|
US9210413B2|2012-05-15|2015-12-08|Imagine Mobile Augmented Reality Ltd|System worn by a moving user for fully augmenting reality by anchoring virtual objects|
US9268136B1|2012-09-28|2016-02-23|Google Inc.|Use of comparative sensor data to determine orientation of head relative to body|
EP2933707B1|2014-04-14|2017-12-06|iOnRoad Technologies Ltd.|Head mounted display presentation adjustment|
KR102165450B1|2014-05-22|2020-10-14|LG Electronics Inc.|The Apparatus and Method for Portable Device controlling Unmanned Aerial Vehicle|
FR3035523B1|2015-04-23|2017-04-21|Parrot|IMMERSION DRONE DRIVING SYSTEM|
Cited by:
US10666941B1|2016-04-06|2020-05-26|Ambarella International Lp|Low bitrate encoding of panoramic video to support live streaming over a wireless peer-to-peer connection|
US10474148B2|2016-07-27|2019-11-12|General Electric Company|Navigating an unmanned aerial vehicle|
US10179607B2|2016-08-03|2019-01-15|Aptiv Technologies Limited|Lane keeping system for autonomous vehicle in wind conditions using vehicle roll|
CN106357324A|2016-11-15|2017-01-25|广州亿航智能技术有限公司|Unmanned aerial vehicle system as well as communication connection method and communication connection apparatus thereof|
DE102016125459B3|2016-12-22|2018-05-03|Spintower Kg|Image acquisition method on an image capture system|
US10784719B2|2017-01-13|2020-09-22|Intel Corporation|Wirelessly powered unmanned aerial vehicles and tracks for providing wireless power|
CN106774435A|2017-01-20|2017-05-31|亿航智能设备(广州)有限公司|The high accuracy follower method and its system of a kind of unmanned plane|
US10687050B2|2017-03-10|2020-06-16|Qualcomm Incorporated|Methods and systems of reducing latency in communication of image data between devices|
WO2018185806A1|2017-04-03|2018-10-11|株式会社Orso|Information processing device, program and method|
CN106898187B|2017-04-17|2019-05-24|金陵科技学院|A kind of rotor wing unmanned aerial vehicle training platform that lands that takes off|
US10939038B2|2017-04-24|2021-03-02|Intel Corporation|Object pre-encoding for 360-degree view for optimal quality and latency|
JP2019040555A|2017-08-29|2019-03-14|ソニー株式会社|Information processing apparatus, information processing method, and program|
CN109076193A|2017-11-07|2018-12-21|深圳市大疆创新科技有限公司|Panoramic image display method and system, panoramic picture processing method and earth station|
CN108295469B|2017-12-04|2021-03-26|成都思悟革科技有限公司|Game visual angle conversion method based on motion capture technology|
US10390063B2|2017-12-22|2019-08-20|Comcast Cable Communications, Llc|Predictive content delivery for video streaming services|
US10798455B2|2017-12-22|2020-10-06|Comcast Cable Communications, Llc|Video delivery|
TWI655409B|2018-03-12|2019-04-01|國立高雄科技大學|Route planning method for aerial photography using multi-axis unmanned aerial vehicles|
DE102019100056A1|2019-01-03|2020-07-09|Manuel-René Gernaert|Control device for controlling real or virtual flying objects|
CN110531789B|2019-09-25|2022-01-25|Chengdu Aircraft Industrial (Group) Co., Ltd.|Fixed wing unmanned aerial vehicle ground guide control handheld device|
Legal status:
2016-04-22| PLFP| Fee payment|Year of fee payment: 2 |
2016-10-14| PLSC| Search report ready|Effective date: 20161014 |
2017-04-18| PLFP| Fee payment|Year of fee payment: 3 |
2017-07-21| TP| Transmission of property|Owner name: PARROT DRONES, FR Effective date: 20170616 |
2018-04-23| PLFP| Fee payment|Year of fee payment: 4 |
2020-01-10| ST| Notification of lapse|Effective date: 20191206 |
Priority:
Application number | Publication number | Priority date | Filing date | Patent title
FR1553091A|FR3034681B1|2015-04-10|2015-04-10|DRONE DRIVING SYSTEM IN IMMERSION.|
US15/087,825| US9738382B2|2015-04-10|2016-03-31|Drone immersion-piloting system|
EP16164302.8A| EP3078402B1|2015-04-10|2016-04-07|System for piloting an fpv drone|
CN201610217313.1A| CN106054870A|2015-04-10|2016-04-08|Drone immersion-piloting system|
JP2016077922A| JP2016199261A|2015-04-10|2016-04-08|Drone immersion type handling system|