Patents by Inventor Hayk Martirosyan
Hayk Martirosyan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12271208
Abstract: A technique is introduced for touchdown detection during autonomous landing by an aerial vehicle. In some embodiments, the introduced technique includes processing perception inputs with a dynamics model of the aerial vehicle to estimate the external forces and/or torques acting on the aerial vehicle. The estimated external forces and/or torques are continually monitored while the aerial vehicle is landing to determine when the aerial vehicle is sufficiently supported by a landing surface. In some embodiments, semantic information associated with objects in the environment is utilized to configure parameters associated with the touchdown detection process.
Type: Grant
Filed: July 11, 2023
Date of Patent: April 8, 2025
Assignee: Skydio, Inc.
Inventors: Rowland Wilde O'Flaherty, Teodor Tomic, Hayk Martirosyan, Abraham Galton Bachrach, Kristen Marie Holtz, Jack Louis Zhu
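A minimal sketch of the force-residual idea described in this abstract, assuming a simple rigid-body model with known mass and commanded thrust; the function names, the support-fraction threshold, and the windowed check are illustrative assumptions, not the patented method:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # m/s^2, world frame with z up

def external_force_estimate(mass, accel_measured, thrust_force):
    """Residual force not explained by gravity plus commanded thrust.

    A landing surface pushing up on the vehicle appears here as a positive-z
    external force roughly equal to the vehicle's weight.
    """
    predicted_accel = GRAVITY + thrust_force / mass
    return mass * (accel_measured - predicted_accel)

def touchdown_detected(force_history, mass, support_fraction=0.8, window=20):
    """Declare touchdown once the estimated upward external force has carried
    most of the vehicle's weight for a sustained window of samples."""
    if len(force_history) < window:
        return False
    recent_z = np.array([f[2] for f in force_history[-window:]])
    return bool(np.all(recent_z > support_fraction * mass * 9.81))
```

In use, each control cycle would append the output of `external_force_estimate` to `force_history` and poll `touchdown_detected` while descending.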
-
Patent number: 12271190
Abstract: Technology for generating and displaying a graphical user interface for operating an unmanned aerial vehicle (UAV) is disclosed herein that generates and updates a representation of a spline flight path. In various implementations, a graphical user interface detects user interactions with a remote control device directing the flight control subsystem of the UAV to record keyframes and to compute a spline based on the keyframes during flight. The graphical user interface displays a real-time perspective of the UAV with a representation of the spline and the keyframes overlaying the view. The graphical user interface continually updates the representation as the UAV flies and as the spline changes when the keyframes are updated.
Type: Grant
Filed: February 15, 2024
Date of Patent: April 8, 2025
Assignee: Skydio, Inc.
Inventors: Matthew Thomas Beaudouin-Lafon, Saumya Pravinbhai Shah, Kristen Marie Holtz, James Anthony Ferrandini, Hayk Martirosyan, Matthew Joseph Donahoe, Charles VanSchoonhoven Wood, Clara Kelley, Adam Parker Bry, Jack Louis Zhu
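A rough sketch of how a spline and its keyframes might be projected into the live camera view for overlay, assuming a pinhole camera model with known intrinsics and pose; the function names, frames, and conventions are illustrative assumptions rather than the disclosed implementation:

```python
import numpy as np

def project_points(points_world, R_wc, t_wc, K):
    """Project 3D world points into pixel coordinates with a pinhole model.

    R_wc, t_wc: camera-to-world rotation and translation (the camera pose);
    K: 3x3 intrinsics matrix. Points behind the camera are dropped.
    """
    points_world = np.asarray(points_world, dtype=float)
    pts_cam = (points_world - t_wc) @ R_wc        # world -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]        # keep points in front of camera
    pix = (K @ pts_cam.T).T
    return pix[:, :2] / pix[:, 2:3]

def spline_overlay(spline_samples, keyframes, camera_pose, K):
    """Return pixel coordinates for drawing the spline polyline and keyframe
    markers over the real-time video view."""
    R_wc, t_wc = camera_pose
    path_px = project_points(spline_samples, R_wc, t_wc, K)
    keyframe_px = project_points(keyframes, R_wc, t_wc, K)
    return path_px, keyframe_px
```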
-
Patent number: 12266131
Abstract: Autonomous aerial navigation in low-light and no-light conditions includes using night mode obstacle avoidance intelligence and mechanisms for vision-based unmanned aerial vehicle (UAV) navigation to enable autonomous flight operations of a UAV in low-light and no-light environments using infrared data.
Type: Grant
Filed: October 19, 2021
Date of Patent: April 1, 2025
Assignee: Skydio, Inc.
Inventors: Abraham Galton Bachrach, Adam Parker Bry, Gareth Benoit Cross, Peter Benjamin Henry, Kristen Marie Holtz, Ryan David Kennedy, Hayk Martirosyan, Vladimir Nekrasov, Samuel Shenghung Wang
-
Publication number: 20250093868
Abstract: Technology for operating an unmanned aerial vehicle (UAV) is disclosed herein that allows a drone to be flown along a computed spline, while also accommodating in-flight modifications. In various implementations, a UAV includes a flight control subsystem and an electromechanical subsystem. The flight control subsystem records keyframes during flight and computes a spline based on the keyframes. The flight control subsystem then saves the computed spline for playback, at which time the UAV automatically flies in accordance with the computed spline.
Type: Application
Filed: December 4, 2024
Publication date: March 20, 2025
Applicant: Skydio, Inc.
Inventors: Saumya Pravinbhai Shah, Matthew Thomas Beaudouin-Lafon, Kristen Marie Holtz, James Anthony Ferrandini, Hayk Martirosyan, Matthew Joseph Donahoe, Charles VanSchoonhoven Wood, Clara Kelley, Adam Parker Bry, Jack Louis Zhu
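One plausible way to compute a spline from recorded keyframes, sketched with SciPy's CubicSpline and a chord-length parameterization; the parameterization and the playback sampling are assumptions, not the disclosed algorithm:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def compute_spline(keyframes):
    """Fit a cubic spline through recorded keyframe positions.

    keyframes: (N, 3) array of x/y/z positions recorded during flight.
    Returns a callable mapping a parameter s in [0, 1] to a 3D point.
    """
    keyframes = np.asarray(keyframes, dtype=float)
    # Parameterize by normalized cumulative chord length between keyframes.
    seg = np.linalg.norm(np.diff(keyframes, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s /= s[-1]
    return CubicSpline(s, keyframes, axis=0)

# Playback: sample the saved spline densely and feed the positions to a
# position controller (controller interface omitted here).
spline = compute_spline([[0, 0, 2], [5, 2, 3], [10, 0, 4], [12, -3, 2]])
waypoints = spline(np.linspace(0.0, 1.0, 200))   # (200, 3) array of positions
```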
-
Publication number: 20250095156
Abstract: Semantic segmentation rendering is performed to encode structural data, including precise and relative component locations, in pixel values of an image depicting a structure. Images captured during a UAV-based exploration inspection of a structure are obtained. In each pixel value that corresponds to the structure within the images, identifiers of the structure, a component of the structure depicted using the pixel value, and a location of the component are encoded. The pixel values of the images are segmented into polygons according to the encoded identifiers, and data indicative of the polygons is stored for use in a further inspection of the structure. In connection with the semantic segmentation rendering, a three-dimensional graphical representation of the structure is obtained and rendered, according to the encoded identifiers, using shaders that visually distinguish each component of the structure, in which the data indicative of the polygons identifies respective ones of the shaders.
Type: Application
Filed: September 20, 2024
Publication date: March 20, 2025
Inventors: Adam Parker Bry, Hayk Martirosyan, Vincent Lecrubier
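An illustrative sketch of encoding identifiers in pixel values, assuming a simple 8-bits-per-channel RGB packing; the bit layout and the identifier fields are hypothetical, not the encoding described in the publication:

```python
import numpy as np

def encode_ids(structure_id, component_id, location_code):
    """Pack identifiers into a 24-bit RGB pixel value (8 bits per field).

    The split (structure / component / coarse location bucket) is one possible
    layout; a real encoding could use more channels or higher bit depth.
    """
    return np.array([structure_id & 0xFF,
                     component_id & 0xFF,
                     location_code & 0xFF], dtype=np.uint8)

def decode_ids(pixel):
    """Recover (structure_id, component_id, location_code) from a pixel."""
    return int(pixel[0]), int(pixel[1]), int(pixel[2])

# Render a tiny segmentation image in which every pixel of a component carries
# the same encoded value, then recover the component id from any such pixel.
seg = np.zeros((4, 4, 3), dtype=np.uint8)
seg[1:3, 1:3] = encode_ids(structure_id=7, component_id=42, location_code=3)
assert decode_ids(seg[2, 2])[1] == 42
```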
-
Publication number: 20250094936
Abstract: An unmanned aerial vehicle (UAV) performs operations to semantically understand components of a structure under inspection. During an exploration inspection of the structure, a camera of the UAV captures images of the structure. Components of the structure are determined based on the images and a taxonomy associated with the structure, for example, using a computer vision process and a machine learning model. A visual representation of the components (e.g., a semantic scene graph, such as a three-dimensional graphical representation of a hierarchical text representation) is generated and output to a user device in communication with the UAV to enable selections, via a graphical user interface output for display at the user device, of ones of the components for further inspection using the UAV.
Type: Application
Filed: September 20, 2024
Publication date: March 20, 2025
Inventors: Adam Parker Bry, Hayk Martirosyan, Vincent Lecrubier
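A toy sketch of a semantic scene graph as a hierarchical component structure that can be flattened into a text representation for selection in a user interface; the class, taxonomy labels, and fields are invented for illustration only:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Component:
    """One node of the scene graph: a structure component, the 3D position
    where it was observed, and any sub-components beneath it."""
    name: str                       # taxonomy label, e.g. "crossarm"
    position: tuple                 # (x, y, z) observed position, illustrative
    children: List["Component"] = field(default_factory=list)

def to_text(node, depth=0):
    """Flatten the graph into the kind of hierarchical text representation a
    GUI could render for selecting components to inspect further."""
    lines = ["  " * depth + node.name]
    for child in node.children:
        lines.extend(to_text(child, depth + 1))
    return lines

tower = Component("transmission_tower", (0, 0, 0), [
    Component("crossarm", (0, 0, 30), [Component("insulator", (2, 0, 29))]),
    Component("foundation", (0, 0, 0)),
])
print("\n".join(to_text(tower)))
```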
-
Publication number: 20250093882
Abstract: Semantic three-dimensional scan is performed for the multi-phase inspection of a structure using an unmanned aerial vehicle (UAV). The multi-phase inspection includes a first inspection phase and a second inspection phase. A UAV performs the first inspection phase of the structure to determine a semantic understanding of components associated with the structure and pose information of the components. Based on the semantic understanding of the components and the pose information, a flight path indicating capture points and camera poses associated with the capture points is determined. The UAV then performs the second inspection phase of the structure according to the flight path, in which all or some of the components are inspected.
Type: Application
Filed: September 20, 2024
Publication date: March 20, 2025
Inventors: Adam Parker Bry, Hayk Martirosyan, Vincent Lecrubier
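A simplified sketch of turning first-phase component poses into second-phase capture points and camera poses, assuming each component comes with a position and an outward surface normal and using a hypothetical fixed standoff distance:

```python
import numpy as np

def plan_capture_points(components, standoff=3.0):
    """For each component (position, outward normal) from the first inspection
    phase, place a capture point 'standoff' meters out along the normal with
    the camera aimed back at the component."""
    plan = []
    for pos, normal in components:
        pos = np.asarray(pos, dtype=float)
        normal = np.asarray(normal, dtype=float)
        normal /= np.linalg.norm(normal)
        capture_position = pos + standoff * normal
        look_direction = -normal            # camera points back at the component
        plan.append((capture_position, look_direction))
    return plan

# Two components with outward normals; the result is the second-phase flight path.
flight_plan = plan_capture_points([((10.0, 0.0, 25.0), (1.0, 0.0, 0.0)),
                                   ((10.0, 5.0, 25.0), (0.0, 1.0, 0.0))])
```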
-
Patent number: 12249139
Abstract: Autonomous aerial navigation in low-light and no-light conditions includes using night mode obstacle avoidance intelligence, training, and mechanisms for vision-based unmanned aerial vehicle (UAV) navigation to enable autonomous flight operations of a UAV in low-light and no-light environments using infrared data.
Type: Grant
Filed: November 21, 2023
Date of Patent: March 11, 2025
Assignee: Skydio, Inc.
Inventors: Samuel Shenghung Wang, Vladimir Nekrasov, Ryan David Kennedy, Gareth Benoit Cross, Peter Benjamin Henry, Kristen Marie Holtz, Hayk Martirosyan, Abraham Galton Bachrach, Adam Parker Bry
-
Patent number: 12189389
Abstract: In some examples, one or more processors of an unmanned aerial vehicle (UAV) control a propulsion mechanism of the UAV to cause the UAV to navigate to a plurality of positions in relation to a scan target. Using one or more image sensors of the UAV, a first image of the scan target is captured from a first position of the plurality of positions, and a second image of the scan target is captured from a second position of the plurality of positions. A disparity is determined between the first image captured at the first position and the second image captured at the second position. A three-dimensional model corresponding to the scan target is determined based in part on the disparity determined between the first image and the second image.
Type: Grant
Filed: November 27, 2023
Date of Patent: January 7, 2025
Assignee: SKYDIO, INC.
Inventors: Peter Henry, Jack Zhu, Brian Richman, Harrison Zheng, Hayk Martirosyan, Matthew Donahoe, Abraham Bachrach, Adam Bry, Ryan David Kennedy, Himel Mondal, Quentin Allen Wah Yen Delepine
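The two-view geometry behind this kind of reconstruction can be sketched with the standard rectified-stereo relation depth = f·B/d; the helper names and the assumption of a rectified image pair with known baseline and focal length are illustrative, not the claimed method:

```python
import numpy as np

def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Rectified two-view relation: depth = focal * baseline / disparity.

    disparity_px: per-pixel horizontal shift between the image captured at the
    first position and the image captured at the second position.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth

def backproject(u, v, depth, fx, fy, cx, cy):
    """Lift a pixel with known depth to a 3D point in the camera frame;
    accumulating such points over many image pairs yields a point-based
    3D model of the scan target."""
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.array([x, y, depth])
```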
-
Patent number: 12175878
Abstract: A technique for user interaction with an autonomous unmanned aerial vehicle (UAV) is described. In an example embodiment, perception inputs from one or more sensor devices are processed to build a shared virtual environment that is representative of a physical environment. The sensor devices used to generate perception inputs can include image capture devices onboard an autonomous aerial vehicle that is in flight through the physical environment. The shared virtual environment can provide a continually updated representation of the physical environment which is accessible to multiple network-connected devices, including multiple UAVs and multiple mobile computing devices. The shared virtual environment can be used, for example, to display visual augmentations at network-connected user devices and guide autonomous navigation by the UAV.
Type: Grant
Filed: July 13, 2023
Date of Patent: December 24, 2024
Assignee: Skydio, Inc.
Inventors: Abraham Galton Bachrach, Adam Parker Bry, Matthew Joseph Donahoe, Hayk Martirosyan
-
Patent number: 12169404
Abstract: Technology for operating an unmanned aerial vehicle (UAV) is disclosed herein that allows a drone to be flown along a computed spline, while also accommodating in-flight modifications. In various implementations, a UAV includes a flight control subsystem and an electromechanical subsystem. The flight control subsystem records keyframes during flight and computes a spline based on the keyframes. The flight control subsystem then saves the computed spline for playback, at which time the UAV automatically flies in accordance with the computed spline.
Type: Grant
Filed: March 8, 2022
Date of Patent: December 17, 2024
Assignee: Skydio, Inc.
Inventors: Saumya Pravinbhai Shah, Matthew Thomas Beaudouin-Lafon, Kristen Marie Holtz, James Anthony Ferrandini, Hayk Martirosyan, Matthew Joseph Donahoe, Charles Vanschoonhoven Wood, Clara Kelley, Adam Parker Bry, Jack Louis Zhu
-
Patent number: 12169406
Abstract: In some examples, one or more processors of an aerial vehicle access a scan plan including a sequence of poses for the aerial vehicle to assume to capture, using the one or more image sensors, images of a scan target. A next pose of the scan plan is checked for obstructions, and based at least on detection of an obstruction, the one or more processors determine whether a backup pose is available for capturing an image of the targeted point orthogonally along a normal of the targeted point. Responsive to determining that the backup pose is unavailable for capturing an image of the targeted point orthogonally along the normal of the targeted point, image capture of the targeted point is performed at an oblique angle to the normal of the targeted point.
Type: Grant
Filed: November 27, 2023
Date of Patent: December 17, 2024
Assignee: SKYDIO, INC.
Inventors: Peter Henry, Jack Zhu, Brian Richman, Harrison Zheng, Hayk Martirosyan, Matthew Donahoe, Abraham Bachrach, Adam Bry, Ryan David Kennedy, Himel Mondal, Quentin Allen Wah Yen Delepine
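A hedged sketch of the fallback logic described above: use the planned pose if it is clear, otherwise try a backup pose along the target's normal, otherwise view the point obliquely. The obstruction check is a stand-in callback and the oblique-pose construction (tilting toward world-up) is purely illustrative:

```python
import numpy as np

def choose_capture_pose(planned_pose, target_point, target_normal,
                        is_obstructed, standoff=2.0, oblique_angle_deg=45.0):
    """Return the pose to use for capturing 'target_point'.

    is_obstructed(pose) stands in for the vehicle's occupancy/obstacle check.
    Poses are (position, look_direction) tuples.
    """
    if not is_obstructed(planned_pose):
        return planned_pose

    target = np.asarray(target_point, dtype=float)
    normal = np.asarray(target_normal, dtype=float)
    normal /= np.linalg.norm(normal)

    # Backup pose: pull straight back along the target's normal.
    backup = (target + standoff * normal, -normal)
    if not is_obstructed(backup):
        return backup

    # Last resort: view the point at an oblique angle to its normal,
    # here by tilting the approach direction toward world-up.
    angle = np.radians(oblique_angle_deg)
    up = np.array([0.0, 0.0, 1.0])
    tilted = np.cos(angle) * normal + np.sin(angle) * up
    tilted /= np.linalg.norm(tilted)
    return (target + standoff * tilted, -tilted)
```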
-
Patent number: 12148205
Abstract: In some examples, an unmanned aerial vehicle (UAV) may determine a plurality of contour paths spaced apart from each other along at least one axis associated with a scan target. For instance, each contour path may be spaced away from a surface of the scan target based on a selected distance. The UAV may determine a plurality of image capture locations for each contour path. The image capture locations may indicate locations at which an image of a surface of the scan target is to be captured. The UAV may navigate along the plurality of contour paths based on a determined speed while capturing images of the surface of the scan target based on the image capture locations.
Type: Grant
Filed: November 10, 2021
Date of Patent: November 19, 2024
Assignee: SKYDIO, INC.
Inventors: Peter Benjamin Henry, Hayk Martirosyan, Quentin Allen Wah Yen Delepine, Himel Mondal, Abraham Galton Bachrach
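A simple sketch of placing image capture locations along a contour path, assuming the spacing is chosen from the camera's field of view, the standoff distance from the surface, and a desired overlap fraction; all parameters and names here are illustrative assumptions:

```python
import numpy as np

def capture_spacing(standoff_m, horizontal_fov_deg, overlap=0.7):
    """Distance between capture locations so that consecutive images share
    roughly 'overlap' of their ground footprint at the given standoff."""
    footprint = 2.0 * standoff_m * np.tan(np.radians(horizontal_fov_deg) / 2.0)
    return footprint * (1.0 - overlap)

def capture_locations(contour_points, spacing_m):
    """Walk the contour polyline and emit a capture location every spacing_m."""
    pts = np.asarray(contour_points, dtype=float)
    locations, carried = [pts[0]], 0.0
    for a, b in zip(pts[:-1], pts[1:]):
        seg_len = np.linalg.norm(b - a)
        d = spacing_m - carried                 # distance into this segment of next capture
        while d <= seg_len:
            locations.append(a + (b - a) * (d / seg_len))
            d += spacing_m
        carried = seg_len - (d - spacing_m)     # distance traveled since last capture
    return np.array(locations)

spacing = capture_spacing(standoff_m=5.0, horizontal_fov_deg=70.0)
points = capture_locations([(0, 0, 10), (20, 0, 10), (20, 15, 10)], spacing)
```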
-
Patent number: 12097957
Abstract: Described herein are systems and methods for structure scan using an unmanned aerial vehicle. For example, some methods include accessing a three-dimensional map of a structure; generating facets based on the three-dimensional map, wherein the facets are respectively a polygon on a plane in three-dimensional space that is fit to a subset of the points in the three-dimensional map; generating a scan plan based on the facets, wherein the scan plan includes a sequence of poses for an unmanned aerial vehicle to assume to enable capture, using image sensors of the unmanned aerial vehicle, of images of the structure; causing the unmanned aerial vehicle to fly to assume a pose corresponding to one of the sequence of poses of the scan plan; and capturing one or more images of the structure from the pose.
Type: Grant
Filed: August 18, 2022
Date of Patent: September 24, 2024
Assignee: Skydio, Inc.
Inventors: Peter Henry, Jack Zhu, Brian Richman, Harrison Zheng, Hayk Martirosyan, Matthew Donahoe, Abraham Galton Bachrach, Adam Bry
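The facet-generation step can be sketched as a least-squares plane fit (via SVD) to a cluster of map points, followed by a capture pose facing the facet squarely; clustering and polygon extraction are omitted, and the standoff distance is an assumption rather than part of the disclosed method:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to a cluster of 3D map points.

    Returns (centroid, unit normal). The facet polygon would then be the
    cluster's outline projected onto this plane (outline step omitted here).
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal

def facet_scan_pose(centroid, normal, standoff=5.0):
    """A capture pose facing the facet head-on from 'standoff' meters away."""
    normal = normal / np.linalg.norm(normal)
    return centroid + standoff * normal, -normal

centroid, normal = fit_plane([(0, 0, 0), (1, 0, 0.01), (0, 1, -0.01), (1, 1, 0)])
pose = facet_scan_pose(centroid, normal)
```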
-
Publication number: 20240310834
Abstract: In some examples, one or more processors of an aerial vehicle access a scan plan including a sequence of poses for the aerial vehicle to assume to capture, using the one or more image sensors, images of a scan target. A next pose of the scan plan is checked for obstructions, and based at least on detection of an obstruction, the one or more processors determine whether a backup pose is available for capturing an image of the targeted point orthogonally along a normal of the targeted point. Responsive to determining that the backup pose is unavailable for capturing an image of the targeted point orthogonally along the normal of the targeted point, image capture of the targeted point is performed at an oblique angle to the normal of the targeted point.
Type: Application
Filed: November 27, 2023
Publication date: September 19, 2024
Inventors: Peter HENRY, Jack ZHU, Brian RICHMAN, Harrison ZHENG, Hayk MARTIROSYAN, Matthew DONAHOE, Abraham BACHRACH, Adam BRY, Ryan David KENNEDY, Himel MONDAL, Quentin Allen Wah Yen DELEPINE
-
Publication number: 20240295876
Abstract: In some examples, one or more processors of an unmanned aerial vehicle (UAV) control a propulsion mechanism of the UAV to cause the UAV to navigate to a plurality of positions in relation to a scan target. Using one or more image sensors of the UAV, a first image of the scan target is captured from a first position of the plurality of positions, and a second image of the scan target is captured from a second position of the plurality of positions. A disparity is determined between the first image captured at the first position and the second image captured at the second position. A three-dimensional model corresponding to the scan target is determined based in part on the disparity determined between the first image and the second image.
Type: Application
Filed: November 27, 2023
Publication date: September 5, 2024
Inventors: Peter HENRY, Jack ZHU, Brian RICHMAN, Harrison ZHENG, Hayk MARTIROSYAN, Matthew DONAHOE, Abraham BACHRACH, Adam BRY, Ryan David KENNEDY, Himel MONDAL, Quentin Allen Wah Yen DELEPINE
-
Publication number: 20240278912
Abstract: Described herein are systems for roof scan using an unmanned aerial vehicle. For example, some methods include capturing, using an unmanned aerial vehicle, an overview image of a roof of a building from above the roof; presenting a suggested bounding polygon overlaid on the overview image to a user; determining a bounding polygon based on the suggested bounding polygon and user edits; based on the bounding polygon, determining a flight path including a sequence of poses of the unmanned aerial vehicle with respective fields of view at a fixed height that collectively cover the bounding polygon; flying the unmanned aerial vehicle to a sequence of scan poses with horizontal positions matching respective poses of the flight path and vertical positions determined to maintain a consistent distance above the roof; and scanning the roof from the sequence of scan poses to generate a three-dimensional map of the roof.
Type: Application
Filed: February 7, 2024
Publication date: August 22, 2024
Inventors: Peter Henry, Jack Zhu, Brian Richman, Harrison Zheng, Hayk Martirosyan, Matthew Donahoe, Abraham Galton Bachrach, Adam Bry
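A rough sketch of covering a bounding polygon with fixed-height lanes and then adjusting altitude to keep a consistent clearance above the roof; the lawnmower pattern over the polygon's bounding box and the roof-height lookup are simplifying assumptions, not the disclosed planner:

```python
import numpy as np

def lawnmower_waypoints(polygon_xy, lane_spacing):
    """Back-and-forth lanes over the polygon's bounding box; a fuller planner
    would clip each lane to the polygon itself."""
    poly = np.asarray(polygon_xy, dtype=float)
    (xmin, ymin), (xmax, ymax) = poly.min(axis=0), poly.max(axis=0)
    waypoints, flip = [], False
    for y in np.arange(ymin, ymax + lane_spacing, lane_spacing):
        xs = [xmax, xmin] if flip else [xmin, xmax]
        waypoints.extend([(x, y) for x in xs])
        flip = not flip
    return waypoints

def scan_poses(waypoints_xy, roof_height_at, clearance=3.0):
    """Keep the horizontal positions from the flight path and set altitude to a
    consistent clearance above the roof surface under each waypoint."""
    return [(x, y, roof_height_at(x, y) + clearance) for x, y in waypoints_xy]

# roof_height_at would come from the overview scan / 3D map; flat stand-in here.
poses = scan_poses(lawnmower_waypoints([(0, 0), (20, 0), (20, 12), (0, 12)], 4.0),
                   roof_height_at=lambda x, y: 8.0)
```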
-
Publication number: 20240273894
Abstract: Systems and methods are disclosed for tracking objects in a physical environment using visual sensors onboard an autonomous unmanned aerial vehicle (UAV). In certain embodiments, images of the physical environment captured by the onboard visual sensors are processed to extract semantic information about detected objects. Processing of the captured images may involve applying machine learning techniques such as a deep convolutional neural network to extract semantic cues regarding objects detected in the images. The object tracking can be utilized, for example, to facilitate autonomous navigation by the UAV or to generate and display augmentative information regarding tracked objects to users.
Type: Application
Filed: December 29, 2023
Publication date: August 15, 2024
Applicant: Skydio, Inc.
Inventors: Saumitro Dasgupta, Hayk Martirosyan, Hema Koppula, Alex Kendall, Austin Stone, Matthew Donahoe, Abraham Galton Bachrach, Adam Parker Bry
-
Publication number: 20240255943
Abstract: Technology for generating and displaying a graphical user interface for operating an unmanned aerial vehicle (UAV) is disclosed herein that generates and updates a representation of a spline flight path. In various implementations, a graphical user interface detects user interactions with a remote control device directing the flight control subsystem of the UAV to record keyframes and to compute a spline based on the keyframes during flight. The graphical user interface displays a real-time perspective of the UAV with a representation of the spline and the keyframes overlaying the view. The graphical user interface continually updates the representation as the UAV flies and as the spline changes when the keyframes are updated.
Type: Application
Filed: February 15, 2024
Publication date: August 1, 2024
Applicant: Skydio, Inc.
Inventors: Matthew Thomas Beaudouin-Lafon, Saumya Pravinbhai Shah, Kristen Marie Holtz, James Anthony Ferrandini, Hayk Martirosyan, Matthew Joseph Donahoe, Charles VanSchoonhoven Wood, Clara Kelley, Adam Parker Bry, Jack Louis Zhu
-
Publication number: 20240228035
Abstract: An autonomous vehicle that is equipped with image capture devices can use information gathered from the image capture devices to plan a future three-dimensional (3D) trajectory through a physical environment. To this end, a technique is described for image-space based motion planning. In an embodiment, a planned 3D trajectory is projected into an image-space of an image captured by the autonomous vehicle. The planned 3D trajectory is then optimized according to a cost function derived from information (e.g., depth estimates) in the captured image. The cost function associates higher cost values with identified regions of the captured image that are associated with areas of the physical environment into which travel is risky or otherwise undesirable. The autonomous vehicle is thereby encouraged to avoid these areas while satisfying other motion planning objectives.
Type: Application
Filed: September 8, 2023
Publication date: July 11, 2024
Applicant: Skydio, Inc.
Inventors: Ryan David KENNEDY, Peter Benjamin HENRY, Hayk MARTIROSYAN, Jack Louis ZHU, Abraham Galton BACHRACH, Adam Parker BRY
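A minimal sketch of scoring a candidate trajectory against an image-space cost map, assuming trajectory points expressed in the camera frame, a pinhole intrinsics matrix, and a precomputed per-pixel cost map derived from depth estimates; all names and parameters are illustrative assumptions, not the disclosed optimizer:

```python
import numpy as np

def trajectory_image_cost(trajectory_cam, cost_map, K):
    """Project trajectory points (camera frame) into the image and sum the
    per-pixel cost under each projected point.

    cost_map: HxW array in which high values mark regions whose depth
    estimates make travel risky; points projecting outside the image or
    behind the camera incur no cost in this sketch.
    """
    h, w = cost_map.shape
    total = 0.0
    for p in np.asarray(trajectory_cam, dtype=float):
        if p[2] <= 0:                       # behind the camera: not visible
            continue
        u = K[0, 0] * p[0] / p[2] + K[0, 2]
        v = K[1, 1] * p[1] / p[2] + K[1, 2]
        if 0 <= u < w and 0 <= v < h:
            total += cost_map[int(v), int(u)]
    return total

# A planner would evaluate (or differentiate) this cost for candidate
# trajectories alongside its other motion-planning objectives.
```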